
How to allow the vCO Appliance to write files to a Windows CIFS share

A frequent question around vCenter Orchestrator is: can I read/write to shared folders? The answer is yes, but the documentation can be difficult to find, especially when you are referring to a Windows share, aka CIFS in Samba and *nix mounting terminology. Read on to learn EXACTLY what you need to do to allow your vCO server to read/write to a Windows share!

Overview:

vCO allows limited access to its local filesystem. The access is determined by the js-io-rights.conf file found in the /etc/vco/app-server/ folder on the appliance. The file contains default settings that allow the server to perform its daily operations. If you wish to allow vCO to read/write to the local filesystem, follow the syntax in that file; for example, to grant access to a folder named /orchestrator, add the following line:
+rwx /orchestrator/
When making changes to this configuration file, you must restart the vCO Server service in order for the changes to take effect.

Now how about network shares? Don't bother with UNC paths; instead, mount the remote filesystem to a local folder on the appliance so that you have a local path that can be added to your js-io-rights.conf file. When mounting that remote filesystem, consider:
Access rights - restricted per user? Guest access? Read Only? Read/Write?
Mount type? NFS, CIFS, etc…
This article’s focus is on CIFS since that seems to present the biggest challenge to vCO users.

POC Use-Case:

Client has a Windows server (in a domain) with a folder shared out to a service account to be used by automation systems for placing files
Provide step-by-step instructions on how to configure and test vCO’s ability to write file(s) to a CIFS share provided by the Windows server

The following steps were performed using a file share created on a Windows 2008 R2 Domain controller and a vCO Server 5.5.1 GA Virtual Appliance:

Windows File Server:

  1. Create a shared folder named “cifs-share” (use whatever name you desire; this article uses “cifs-share” for consistency)
  2. Grant a user read/write access to the share (in this case, we are using a domain user named ldev with a password of VMware1!)

vCO Appliance:

  1. Create an empty folder for the CIFS mount in the vCO home directory and chmod it to 755 (note: these steps could instead use /mnt/cifs-share if desired):
    mkdir /var/lib/vco/cifs-share
    chmod 755 /var/lib/vco/cifs-share
  2. Update /etc/fstab to automatically mount the cifs share:
    //192.168.110.10/cifs-share /var/lib/vco/cifs-share     cifs     username=ldev,password=VMware1!,rw,file_mode=0777,dir_mode=0777     0 0

    NOTE:
    This CIFS mount must be referenced by IP address; FQDN does not work.
    The password may need to be escaped with quotes if it contains spaces. After editing /etc/fstab, run mount -a to mount the share and verify the entry works. Additional, optional parameters to the above line could include:
    nounix,iocharset=utf8,sec=ntlmv2
  3. Modify js-io-rights.conf to allow Read/Write/Execute for the folder:
    +rwx /var/lib/vco/cifs-share

    NOTE: Be careful of the ordering of lines in this file as they are read in order. (This means you may need to re-arrange the lines if the permission doesn't work)
  4. Restart vCO Server service:
    service vco-server restart
  5. Test: Use the “Export logs and application settings” Library workflow to confirm vCO is able to write to the folder. When launching the workflow, specify the full path to the mounted folder. In this example, that would be /var/lib/vco/cifs-share
Exported Logs And Application Settings
The screenshot above shows the input for the library workflow. This particular workflow uses a number of calls to export files and generate a zip file in the specified folder. In this case that folder happens to be a CIFS share mounted on the vCO appliance. Take a look inside this workflow to learn more about how the code works.

vCO generated files in CIFS Share
The screenshot above shows the cifs-share on the Windows File server has several files that were written by the vCO server directly to this share via the mount that was configured.

vCO Exported files have desired owner set
We see in the screenshot above that each file's owner is the "ldev" account that we configured vCO to use when connecting to the share.

Summary:

As shown above, it only takes a few minutes of configuration and a service restart to get vCO to a point that it is capable of reading and writing files to a remote filesystem via a CIFS share. Now that the permissions and configuration are in place, you can use the FileWriter and other vCO features to access files on such shares.
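
For example, a minimal scriptable task that writes a test file to the share might look like the following sketch (the file name is illustrative, and the directory must be covered by your js-io-rights.conf entry):

// Write a test file to the mounted CIFS share using vCO's FileWriter
var writer = new FileWriter("/var/lib/vco/cifs-share/vco-test.txt");
writer.open();
writer.writeLine("Hello from vCO via the CIFS mount");
writer.close();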

Have fun!

VMware released vCenter Orchestrator 5.5.1

This week VMware released vCenter Orchestrator 5.5.1. While this is not a major release, it contains a lot of new features focused on making vCO content development easier. Also important is the list of deprecated features.

Protocol plug-ins are now included

The days when vCO was mainly used to automate vCenter are over. vCO integrates with many third-party applications using generic protocol-based plug-ins. In previous releases you had to find the latest release, download the plug-ins, and install them. You no longer have to worry about this.

The main plug-ins shipped with the vCO platform are SSH, SQL, mail, and vCenter. Starting with vCO 5.5.1 they have all been bumped to version 5.5.1.[plug-in build number].

Plug-Ins included with vCenter Orchestrator 5.5.1 (that were not in 5.5.0)

  • vCenter Orchestrator SOAP Plug-In 1.0.3
  • vCenter Orchestrator HTTP-REST Plug-In 1.0.3
  • vCenter Orchestrator Plug-In for Microsoft Active Directory 1.0.4
  • vCenter Orchestrator AMQP Plug-In 1.0.3
  • vCenter Orchestrator SNMP Plug-In 1.0.2
  • Dynamic Types 1.0.0 (experimental – requires 5.5.1 U1 build)


NB: vCloud Automation Center uses an older build of vCO 5.5.1 that also includes the same protocol plug-ins, plus the PowerShell and vCAC IaaS plug-ins.

The Dynamic Types plug-in requires at least the build that was released this week (and will likely require a newer build once released as final).


Tagging

You may already be familiar with tags in vSphere. They are a convenient way to set a key/value property on an object so that queries can be used to list the objects with specific tags.

vCO 5.5.1 adds methods at the JavaScript and REST API levels to tag any vCO object with a string from 3 to 64 characters. If you search for "tag" in the API explorer you will notice 16 methods to create, find, and query tags.

Tags can be 'global' or 'private':
  • Global tags - visible by all users.
  • Private tags - visible only to the user who created the tag.

Use case example: find workflows by tag (i.e., “remediation”, “presentation”). This is the first step of the implementation. It is likely that newer versions of vCO will allow such queries to be performed not only from the vCO client but also from vCO workflow consumers such as the vSphere Web Client and vCAC.

Tags set on objects in packages can be exported & imported to another vCO server.
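
As a minimal sketch of the scripting side (the method names and signature below are assumptions; verify them against the 16 tagging methods listed in your API explorer):

// Tag a workflow globally, then privately (assumed signature: tag(object, tag, value, global))
var wf = Server.getWorkflowWithId("your-workflow-id"); // replace with a real workflow ID
Server.tag(wf, "remediation", "", true);   // global tag, visible to all users
Server.tag(wf, "presentation", "", false); // private tag, visible only to its creator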


Dynamic Types

The objects you see in the vCO inventory (such as VMs and datastores) are the ones you can use as inputs to your workflows. They come from the vCO plug-ins and are statically defined by a plug-in developer using the freely available plug-in SDK. The plug-in developer is typically a software engineer with Java / Eclipse / web services skills.


Dynamic Types allow you to define new types without installing a new plug-in. The plug-in can instead be implemented by a system engineer with scripting experience, by creating workflows that create objects from calls to the API of the system being orchestrated.

vCloud Automation Center allows you to provision and run day-two operations on any service (XaaS) as long as you have a vCO inventory object that you can associate with a service blueprint. By leveraging Dynamic Types it is possible to create service blueprints for any service that can be orchestrated via an API.

Technically, the plug-in provides workflows to help you author your own plug-in:



Once you have finished creating objects and their relations, the inventory will list the type hierarchy:




For each object you define you will need to implement 4 workflows (FindById, FindAll, FindRelation, HasChildrenInRelation). Once completed you will be able to browse your inventory.



Then you will need to implement Create, Update, and Delete workflows for each of these objects.
The effort for creating such a plug-in is proportional to the number of objects you need to handle.

NB: The Dynamic Types plug-in released in 5.5.1 this week is beta quality, so developers can start to get familiar with it, but it is not yet supported for production use.


REST API

There are a few new REST API calls in 5.5.1.

One notable new feature is that you can start a vCO action through the REST API:
  • POST /actions/{actionId}/executions Runs an action with given ID
  • POST /actions/{categoryName}/{actionName}/executions Runs an action with given category and name
You may wonder what the use case is, since you can start a workflow that starts a particular action. A workflow is a transactional process aimed at reliability, with features such as audit logs. For use cases that need an API call with a real-time answer (e.g., displaying a drop-down when an end user clicks in a web user interface), responsiveness is the more important quality, and running an action through the REST API is a lot more responsive than running a workflow.

[Sorry, but after checking, the feature above was not included in the 5.5.1 GA and may come in a later release.]


You can now delete a folder / category using the REST API:
  • DELETE /categories/{id} Deletes a category with a given ID
And of course the new tagging functionality also provides a REST API so the feature can be leveraged from an external application:
  • GET /tags Retrieves list of tag owners.
  • GET /tags/{owner} Retrieves list of tags created by a specific user.
  • DELETE /tags/{owner} Removes all tags created by specified owner.
  • GET /tags/{owner}/{tagName} Retrieves list of tag instances created by a specific user.

Expand to folder

If, as a vCO developer, you need to put the content of your package in a version control tool, you now have the option to expand a package to a folder.



When you do this, all the elements included in the package will be unarchived in their raw XML format so they can be checked into the version control system.




Plug-in Configuration API

If you use the plug-in SDK, you now have access to facilities that will store the plug-in configuration for you. In previous releases you had to implement saving the plug-in configuration yourself, in a file, a resource element, or a database, using a format such as XML or JSON.
Using these facilities, the configuration is stored in a vCO configuration element in the vCO database, making it available to all the vCO nodes. This eliminates the requirement of replicating file-based plug-in configuration across all the vCO nodes.



SDK in Appliance

If you are using the plug-in SDK, you can now leverage a Maven repository hosted on the appliance.
This repository contains the project files and dependencies required to build a vCO plug-in.


The instructions are included on the vCO appliance page:


Deprecated features

These features are still available in vCO but will be removed in upcoming releases.
  • vCenter Orchestrator 32-bit client: the XP days are over; you will have to use a 64-bit client.
  • vCenter Orchestrator Web Views: You should use the REST API for your custom web design requirements or the vSphere web client for infrastructure operators or the vCloud Automation Center portal for end users.
  • vCenter Orchestrator Simple Object Access Protocol (SOAP) service API: the REST API is the way to go; it is much more powerful than the SOAP one.
  • vCenter Orchestrator Configuration Interface: this change is aimed at plug-in developers: new plug-in configuration should be driven by workflows, as is now the case for the VMware-authored plug-ins. This allows, for example, configuring a plug-in from the vCloud Automation Center interface. vCO configuration is now possible through the vCO configuration plug-in.


How to get it

  • You can check the release notes here.
  • You can download vCO here.
  • You can get the vCO documentation here.






How To Configure vCAC's Embedded vCO To Allow Domain Account Login


If you're reading this article, it may be because you have installed vCloud Automation Center (vCAC) and are interested in using an account other than administrator@vsphere.local to login to the embedded vCenter Orchestrator (vCO) server. By default, the vCO Server uses a "vcoadmins" group in the "vsphere.local" domain provided by the SSO server that vCAC was configured to use. This short tutorial will step you through a pretty basic configuration where I have just deployed a vCAC 6.x appliance and wish to use my domain account for vCO login.

 Prepare Active Directory

This article assumes the use of Active Directory since we are attempting to allow a Domain account to login to the vCO client. As such, we need a group for our vCO Administrators.

  • Create a vCO Administrators group (vcoadmins for example) in Active Directory
  • Add domain accounts that you wish to allow to login to vCO to that group

Start the vCO Configuration Service


Once an AD Group has been defined for this use, the vCO server must be reconfigured to use that group. The configuration service is not running by default on the vCAC appliance.

  • SSH to your vCAC Appliance
  • Start the vCO Configuration service by issuing the following command:
    service vco-configurator start
  • Once the service has started, the link shown in the above screenshot will work. It points to https://vcac-appliance-url:8283

Login To vCO Configuration Page

  • Login to the vCO Configuration page using the link from your vCAC Appliance
  • At first login, use the username vmware with password vmware
  • You will be prompted to change the password at first login

Change Authentication


  1. Once you have logged in, click on the "Authentication" tab in the left pane
  2. Click the drop-down next to "vCO Admin - domain and group"
  3. You should see the list of vsphere.local groups and a list of groups from the domain that SSO is tied to - choose the domain "vcoadmins" group you created earlier
  4. Click the "Update Orchestrator Configuration" button - confirm that the top of the window displays "Orchestrator configuration is successfully saved"

Restart vCenter Orchestrator Server Service


Now that you have reconfigured the group for your vCO Administrators, you must restart the vCO Server Service in order for the changes to be applied:

  1. Click the "Startup Options" tab on the left
  2. Click the "Restart service" link on the right
  3. After a few moments, the "Server is restarted" message should appear on the page

Wait another two minutes or so before attempting to log in using your vCO Client.

Login to vCO Client Using Domain Account


As you can see above, I am now able to login to my vCAC embedded vCO Server using my domain credentials!

Enabling VMware vCloud Automation Center XaaS with vCO dynamic types


You may have noticed that the vCO 5.5.1 release notes list a new feature called "Dynamic Types":

"Workflow developers are now able to explore the new Dynamic Types which currently is being shipped with Beta quality. They can easily extend vCenter Orchestrator plug-ins by adding their custom types accessible from the scripting API. New types become available in the inventory right after creation and they could be directly leveraged from the vCAC ASD context as part of the cloud provisioning process and XaaS definition."

This article explores how Dynamic Types can be leveraged to create your own mini vCO plug-ins without needing any Java development skills.

 

vCloud Automation XaaS (Anything as a Service)

One of the most powerful vCloud Automation Center features is XaaS (Anything as a Service). It gives the ability to request, approve, provision, operate, and decommission any type of catalog item (i.e., storage, applications, accounts, anything you can provide as a service).

The vCloud Automation Service Architect designs a service blueprint calling a provisioning workflow that outputs a custom resource. This custom resource is mapped to a vCO inventory object referencing the orchestrated system entity. It can then be used as an input to the workflows defining the operations available for this item, and to the decommissioning workflow.

When not used contextually an inventory item is presented to the end user as an input field with a tree view or a list view selection.

An inventory item type is statically defined with a type name (i.e., VirtualMachine) and a list of properties (i.e., name, state), and is prefixed with the vCO plug-in name (i.e., VC:VirtualMachine). This definition is contained in the plug-in description file (VSO.XML). The more vCO plug-ins are installed, the more inventory types vCO is able to manage.

It is possible to get additional plug-ins from VMware and partners using the VMware Solution Exchange, or to create a plug-in with the Java-based plug-in SDK.

If there is no out-of-the-box vendor plug-in, and if the resources / Java development skills to develop one are not available, it is still possible to rapidly design workflows that orchestrate systems exposing a web service, using the SOAP or REST plug-in. These allow pulling entity information from remote systems and operating on those entities as part of the vCloud Automation Center Infrastructure as a Service lifecycle. This is great, but it is still missing the inventory object required by vCAC XaaS catalog items and day-two operations.

Dynamic Types is a new, experimental vCO feature shipped starting with vCO 5.5.1 that allows creating inventory types dynamically without doing any Java development. It brings together the quick implementation of the REST / SOAP plug-ins with the convenience of the inventory objects leveraged by vCAC XaaS.

This article:

  • Describes the Dynamic Types toolbox provided by VMware
  • Explains the concepts involved in creating a plug-in using Dynamic Types
  • Steps you through the plug-in generator package I created to accelerate Dynamic Types plug-in development

 

The Dynamic Types toolbox

A set of workflows are provided as part of the Dynamic Types plug-in to help design a plug-in:

  • Define / Update / Remove Namespace
  • Define / Update / Remove Type
  • Define / Remove Relation
  • Export / Import configuration / type definitions

Here is the complete list of these workflows:



The sequence is to create a namespace in which types are created and relations between these types are established.

Using these workflows will update the Type Hierarchy shown in the vCO inventory under "Dynamic Types", and also the namespace inventory objects, if there are any. Below is an example of a Type Hierarchy:




The plug-in configuration is written to the config resource element in the Library/Dynamic Types folder. A JSON editor can be used to edit the exported config resource; it is then possible to create / update / delete namespaces, types, and relations by updating the resource element. It is recommended to make a copy of this file every time the resource file is exported, so you can roll back to the last working version in case of an unwanted configuration change.

Below is an example of a plug-in configuration file displayed in a JSON editor. There are several JSON viewers / editors available online.




The export / import functionality is a convenient way to save the plug-in configuration in a resource file that can be included in a package so the plug-in configuration can be loaded on another vCO server.

 

The concepts for building a Dynamic Type plug-in

Building a Dynamic Types plug-in requires understanding the vCO elements that have to be created and the back-and-forth communication between vCO and the remote system.

 

Namespace

The first and very simple step is to create a namespace. It is the root element of the plug-in and defines the prefix of the inventory objects. You can run the “Define namespace” workflow. For example I created an iTop namespace to manage the Combodo iTop application.

 

A REST or SOAP host

While it is possible to integrate with a remote system in different ways, most applications offer a REST or SOAP web service. Run the “Add a new host” (SOAP or REST) workflow, selecting the right authentication method and credentials. Refer to the API documentation for this configuration, and use a test operation to make sure the connection and authentication work as expected.

 

findAll, findById, findRelation

Each object type of the plug-in requires creating REST or SOAP operations to:

  • List all objects for a given type (findAll): this is used by the list view drop-down
  • Get an object for a specific id (findById): this is used to resolve and display the object in an input field and between each step of a workflow. This operation requires an id parameter, represented as {id} in the operation string
  • List child objects of a specific parent object (findRelation): this is used in the inventory tree view and is invoked when an inventory item is unfolded

To create these operations you need to find the target system's SOAP / REST API URL and use the “Create a new operation” (REST / SOAP) workflows.

To invoke these operations, and as a result create new Dynamic Types inventory objects, you need to create three vCO actions:

  • findAll :
    • Input = object type
    • Output = array of Dynamic objects
  • findById :
    • Input = object type and Id
    • Output = Dynamic object
  • findRelation :
    • Input = object type and Id
    • Output = array of Dynamic objects

Invoking the operations above results in string-based HTTP response content. Typically this will be an XML or JSON (JavaScript Object Notation) string representation of the object, including its properties.

These properties need to be extracted and passed as parameters to the makeObject() method.

vCO supports a programming language extension called E4X (ECMAScript for XML) that adds native XML support to JavaScript, and its JavaScript engine can also natively convert JSON strings to JavaScript objects (JSON.parse()). This makes it convenient to extract the object properties without complex parsing.
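
For example, assuming response is the RESTResponse returned by one of the operations above:

// JSON response content: natively converted to a JavaScript object
var objects = JSON.parse(response.contentAsString);
System.log(objects[0].name);
// XML response content: E4X gives native XML syntax
// (note: new XML() rejects a leading <?xml ?> declaration, so strip it first if present)
var doc = new XML(response.contentAsString.replace(/^<\?xml[^>]*\?>/, ""));
System.log(doc.item[0].name.toString()); // element names depend on the remote API's schema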

The makeObject method has 2 mandatory parameters (ID and name) and optional object properties (using a vCO Properties() object).

Note that the properties of a Dynamic Types object are of simple types. You cannot set a property to a JavaScript object.

There is one extra action needed as well: hasChildrenInRelation is used to determine whether an inventory object has children (represented in the tree view by a >); findRelation is called when it returns true. A simple implementation of hasChildrenInRelation is to test whether findRelation returns more than 0 objects.
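
Putting the pieces together, a findAll action might look like the following rough sketch. The REST endpoint, namespace, and type names are illustrative, restHost is assumed to be a RESTHost attribute, and the exact makeObject() signature should be checked in your API explorer:

// findAll action sketch -- output: array of dynamic objects for the inventory
var results = [];
var request = restHost.createRequest("GET", "/devices", null); // "/devices" is a hypothetical endpoint
var response = request.execute();
if (response.statusCode != 200) {
    throw "findAll failed with HTTP " + response.statusCode;
}
var devices = JSON.parse(response.contentAsString);
for each (var device in devices) {
    var props = new Properties();
    props.put("state", device.state); // simple-type properties only, per the note above
    results.push(DynamicTypesManager.makeObject("MyNamespace", "Device", device.id, device.name, props));
}
return results;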

It is also possible to use workflows instead of actions during development, to see whether they are called and run as expected. However, this should not be used in production, for performance and reliability reasons: an action runs several times faster than a workflow, and a workflow needs to resolve its objects, which requires calling the findById workflow. This can start endless loops.

Once the four actions for a given object type have been successfully tested, it is possible to create the Dynamic Types object definition using the provided “Define Type” workflow. This consists of setting a type name, an icon, the object properties, and the mapping to the four actions.

To present properly in the inventory, you need to create an object that acts as a folder and create a relation where this folder object is the parent of the objects you created. For this, the findRelation action will return an array with a single object.

Once completed you should be able to:

  • Unfold the Dynamic Types inventory and see the folder object under the namespace; unfold it and see a list of objects.
  • Use this type as an input / output parameter of a workflow. When dealing with an input, you will see a list view with all the objects.

If findAll and findRelation are too network-intensive and result in slow inventory updates, it is possible to use the putInCache() / getFromCache() methods provided as part of the plug-in to cache the array of objects to be returned. A cache expiration policy can be designed to avoid leaving the inventory stale for too long.
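
A cached findAll could follow the pattern below. This is a sketch only: putInCache() and getFromCache() are the helper names mentioned above, but their exact signatures, the cache key naming, and fetchAllObjects() are assumptions for illustration:

// Hypothetical caching pattern for findAll (helper signatures are assumptions)
var cacheKey = "MyNamespace.Device.findAll"; // illustrative key naming
var cached = getFromCache(cacheKey);
if (cached != null) {
    return cached; // serve the inventory from the cache
}
var objects = fetchAllObjects(); // hypothetical: the findAll logic from the previous section
putInCache(cacheKey, objects);
return objects;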

 

CRUD operations

Having an inventory object makes it easy to select objects and get their properties. Creating, updating, deleting, or calling a method on the object requires some additional work.

For each of these operations it is necessary to create the matching SOAP/REST operation. Updating, deleting, or calling a method will require passing the object ID.

Once you have created one of these operations you need to create an action that:

  • Gets the ID from the object that is an input parameter of the action (except for create)
  • Invokes the SOAP / REST operation, passing the ID
  • Gets the result / does error checking
  • Returns the created / updated object
  • Invalidates the inventory when necessary (on update / delete)

 

The created actions should be wrapped in library workflows shipped with the plug-in.

After some time, depending on the number of objects that need to be implemented and the number of CRUD operations, you can expect a result like this Combodo iTop plug-in implementation:

 

 

 

The plug-in generator package

This is my attempt to make Dynamic Types plug-in development as simple and as quick as possible. Depending on the orchestrated technology, it is possible to create a plug-in without writing a single line of JavaScript code. Here is how it happened:

  • I implemented a first plug-in (Combodo iTop) including a lot of objects (close to 90) in an inventory with multiple levels of cached objects, using a single set of findAll, findById, findRelation, and hasChildrenInRelation actions.
  • I isolated all the re-usable scripting and replaced all the specific scripting with generic scripting plus configuration files.
  • From these building blocks I created a plug-in generator workflow and tested it successfully against the Twitter API without having to write any specific scripting.
  • I improved it to make it more interactive, adding several steps to check the data returned, and tested it against the vCO API. This was less straightforward, since some queries did not return the objects in the format I was expecting. Knowing this would be a common issue (there is no such thing as API consistency), I implemented placeholders to replace the default actions I provide for invoking a request and deserializing the objects returned by the queries.

At the moment it is designed to handle HTTP REST hosts returning JSON strings. Using XML is certainly possible with custom actions. Using SOAP requires more work, but a lot of the included scripting should provide a good basis for someone willing to adapt this solution to SOAP.


You can find the Dynamic Types plug-in generator package on the VMware communities.

Why is it beta?

You should use this technology in your labs and / or for development, not yet in a production environment.

Here are some of the known issues to be fixed in next releases:
  • vCAC 6.0.X cannot yet request a catalog item linked to a dynamicType via a resource action

On the Dynamic Types plug-in & vCO side:
  • No way to refresh the inventory view when objects are updated / deleted
  • Duplicate objects when using more than a single namespace
  • Performance & stability to be improved




Dynamic Types tutorial : Implement your own Twitter plug-in without any scripting

In a previous article I explained how vCO Dynamic Types simplify the development of vCO plug-ins and how they are leveraged by VMware vCloud Automation Center XaaS. It is now time to experiment with creating a plug-in by leveraging the Dynamic Types plug-in generator package.

Warning: You can do it without Java development experience and without having to write a single line of scripting!

As a first step download the Dynamic Types plug-in generator package from VMware communities.


The plug-in generator Tutorial: Twitter


In order to get familiar with building a plug-in, we start with a service that has a public API: Twitter.

Prerequisites:
  • A Twitter account
  • A Twitter application

To create a Twitter application, log in with your Twitter account at https://apps.twitter.com/
Enter an application name, description, and URL (i.e., your site). Twitter will generate the API key, API secret, and access tokens. These will be needed for your application to authenticate against the Twitter API.


Download the Dynamic Type plug-in generator package from here. Import it.

REST Host and namespace creation

Run the "Create a plug-in (namespace and matching REST host).
  • For URL use : https://api.twitter.com/1.1/
  • Only enter the proxy settings is you must use a proxy.
  • For Host authentication type select Oauth 1.0
From apps.twitter.com copy and paste from the API keys tab:
  • The API key in the Consumer key input field
  • The API secret in the Consumer secret input field
  • The Access token
  • The Access token secret

  • Enter Twitter for the namespace name

Plug-in type creation

The Twitter API (documented at https://dev.twitter.com/docs/api/1.1/) allows listing different objects, including Tweets. We will use the Mentions timeline as an example.

Run the "Create a new plug-in type"
Enter the fields as below (You can use any icon resource you have uploaded in your vCO server, here I have uploaded a mentions.png).



Submit.

FindAll


The next screen requests the URL used to get all the mentions. According to the documentation it is statuses/mentions_timeline.json.
The second field defines the action that will invoke the REST operation. Since there is no need to use specific headers or pass additional credentials as parameters, leave the default executeRequest action.





vCO will now attempt to invoke the REST operation and display its response content. If the REST host settings and findAll URL were entered correctly, you should see a JSON string. Copy the full content of the response string and paste it into a JSON viewer or editor. Below is how this looks. It shows the JSON object is an array [] containing objects {}0, {}1, and so on. Each object has a set of properties. We can notice there are two different ID properties and that the Tweet content is in the text property. There are also some properties that are objects, such as user or entities.





If you got similar valid response content, set "Valid response content" to yes. Otherwise set it to no; in this case you will be asked to set the URL and invoke the action again. If there is a problem with the REST host settings, you can cancel the workflow, right click Run Workflow, and start the "Update a REST host" workflow. Once updated, you can get back to the "Create a new plug-in type" workflow run and right click / answer to resume it.




vCO will now attempt to detect the object properties. This will only work if the object format is standard (meaning an array of objects where each object has the same properties). This is the case for Twitter.
To create an object, vCO needs two mandatory properties: id and name. If there are properties matching these names they will be used; if not, the workflow will pick other properties. It is important to check that the id is unique and that the name is meaningful, since it will be shown in the inventory. The property accessors are basically the way the object will populate its properties from the returned object's properties. They can be customized using a JSON editor.




Copy and paste the content of the findAll object property accessors into a JSON editor.
We will focus our attention on the id and name properties:

id: object[\"id\"]
name: object.id

We can see that the id and name will both be set using the id property (object[\"id\"] and object.id both refer to the value of the id property; the first form escapes the property name in case it contains a special character).

Since we want to have something more meaningful for the Tweet name we will replace the name property value by:

object.user.name + " @" + object.user.screen_name + " " +  object.text

And convert it back to a JSON String as follows:
{
  "id": "object[\"id\"]",
  "name": "object.user.name + \" @\" + object.user.screen_name + \" \" +  object.text",
  "created_at": "object[\"created_at\"]",
  "id_str": "object[\"id_str\"]",
  "text": "object[\"text\"]",
  "source": "object[\"source\"]",
  "truncated": "object[\"truncated\"]",
  "in_reply_to_status_id": "object[\"in_reply_to_status_id\"]",
  "in_reply_to_status_id_str": "object[\"in_reply_to_status_id_str\"]",
  "in_reply_to_user_id": "object[\"in_reply_to_user_id\"]",
  "in_reply_to_user_id_str": "object[\"in_reply_to_user_id_str\"]",
  "in_reply_to_screen_name": "object[\"in_reply_to_screen_name\"]",
  "user": "object[\"user\"]",
  "geo": "object[\"geo\"]",
  "coordinates": "object[\"coordinates\"]",
  "place": "object[\"place\"]",
  "contributors": "object[\"contributors\"]",
  "retweet_count": "object[\"retweet_count\"]",
  "favorite_count": "object[\"favorite_count\"]",
  "entities": "object[\"entities\"]",
  "favorited": "object[\"favorited\"]",
  "retweeted": "object[\"retweeted\"]",
  "possibly_sensitive": "object[\"possibly_sensitive\"]",
  "lang": "object[\"lang\"]"
}

And paste it back to the findAll object properties accessors field in the workflow.

Keep "Get object properties using custom action" to "no". This option is used when the JSON format is not standard and requires to be decoded by a custom action.
Submit.

Now vCO will use the properties you have defined to get the property values and display them:



It is now important to check that these values are valid. Below is a sample I copied and pasted into a text editor. I am satisfied with the name property, so I will keep it this way. You may notice that properties containing objects kept a JSON format. This is fine as well; these can easily be converted in vCO scripting. What gets my attention is the fact that there are two id properties. Having the proper id is very important: not only does it need to be unique, but it will be used by vCO to get the tweet by ID.



Expanding this particular tweet on the Twitter site, I can see that the URL of the tweet is https://twitter.com/CloudOps_Wayne/status/454657048286740480, which matches the id_str property, not the id one.

Since I am not happy with the property values, I keep "Valid object properties" set to "No" and submit.

Back in the findAll object property accessors, I paste the JSON string from the JSON editor, taking care to update the id property to id_str as follows:


Submit.

Now the object properties show the right id and name. Set "Valid object properties" to "yes".

findById


The workflow now expects the findById HTTP URL. It is statuses/show.json. The API documentation mentions that the id parameter is mandatory.
Since we cannot hardcode a particular id in the URL, we will use the {id} placeholder.

We will keep using the executeRequest action as we did before and need to select a specific ID to test invoking the URL. The ID list is pulled from the previous findAll call.



Submit. We now get the JSON response. Copying / pasting it into an editor shows that the result is as expected.





Set "Valid response content" to "yes".


As before, the workflow pulled the object properties. We want to make the same changes as before for the id and name properties. Update these as you did earlier.





And as before, you can check that the right property values for the object with this particular ID were pulled. If so, set "Valid object properties" to "yes".





Submit.

findRelation

We can now define the findRelation URL. In the simplest use cases this returns, under a folder in your inventory, all the objects of a given type. The query will then be the same as for findAll.
More complex use cases would use particular request parameters or filter the returned results based on the parent objects and folders. Repeat the same sequence as for findAll.




Submit



Set "Valid response content" to "yes".
Submit


Apply the same changes as before.



Submit.




Set "Valid object properties" to "yes".

Submit.



For defining the final object properties you have two options: use the properties returned by any of the previous queries, or only the ones returned by all of them. In the first case some properties may not be set for some queries; in the second they should be set consistently. Set yes.



Submit.

Checking the new object type

Check the Type Hierarchy in the inventory. The workflow created a MentionTimeline object under a Mentions one (the folder).



If you now unfold the Twitter namespace, then the Mentions folder, you will get the mention tweets returned by the Twitter API. This proves findRelation works as expected.




You can then create a workflow with a MentionTimeline type as input and test running the workflow as below.



From a scriptable task you will get access to the object properties using tweet.getProperty(propertyName);
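
For example, in a scriptable task with an input named tweet of the new type (property names per the accessors defined earlier):

// Read the Dynamic Types object's properties from a scriptable task
var text = tweet.getProperty("text");
var user = JSON.parse(tweet.getProperty("user")); // object-valued properties kept their JSON format, as noted earlier
System.log("Tweet from @" + user.screen_name + ": " + text);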


Define a custom operation method

It is nice to have an inventory of objects; it is even nicer to have workflows to operate on them!

To do this, run the "Define an object method" workflow. Select the namespace and object type your operation will apply to.





Enter the name of the operation / method. Below, I want to be able to retweet.




Submit.
On the next screen we need to provide the retweet HTTP method and URL as documented in the Twitter API doc.
We will keep the default action to invoke the operation, since it works well for Twitter.
To test the operation we need to provide the ID of an object this operation will apply to. Below I have selected an ID based on what I could see in the vCO inventory.





vCO executes the operation and shows the response content. If this seems right, you can set "Valid response content ?" to "yes".





As before, the workflow will get the object properties. To make sure the right properties are used, you need, as earlier in this tutorial, to change the id to id_str and the name to the expression we entered before.





Submit. The resulting properties are shown. If these are correct, set "Valid object properties ?" to "yes".







Submit.



Now that we have created the operation, we need to create a workflow to call it. Create a new workflow and name it.




Drag and drop on the schema the "Invoke method template" workflow.


Use the setup button at the top of the schema to bind its inputs and outputs to your main workflow. Set these as follows:
  • dynamicType as an input
  • method as a value set to the name of your operation (below, retweet)
  • content skipped, as it is not needed for Twitter (other APIs require passing content such as a JSON string)
  • updatedObject as an output



Click "Promote"
There are two small things that have to be adjusted. On the attributes page, change the dynamicTypeObject type to the type you need to operate on.


And do the same for the output parameter.



For convenience, set the dynamicTypeObject presentation properties to mandatory and "show in inventory" (which will make your workflow contextual).




Save and close your workflow.


You can now unfold your inventory, select an item, right click, and run your workflow (edit the user preferences / inventory settings to have it shown as below).





And VOILA!

VMware released new and updated vCO plug-ins


vCO has again been extended with a new plug-in and updates to existing plug-ins.

  • VMware vCenter Orchestrator AWS 1.0 Plug-in (New)
  • VMware vCenter Orchestrator Auto-Deploy 5.5.1 Plug-in (Updated)
  • VMware vCenter Orchestrator Multi-Node 5.5.1 Plug-in (Updated)
  • VMware vCenter Orchestrator AMQP 1.0.3 Plug-in (Updated)
  • VMware vCenter Orchestrator PowerShell 1.0.4 Plug-in (Updated)
  • VMware vCloud Automation Center 6.0.1 Plug-in (Updated) - released date April 25, 2014 (article updated to include this entry)




How to use the REST API to Resume a Failed Workflow


One of the relatively new 5.x features of vCenter Orchestrator (vCO) is the ability to enable a workflow to resume on failure. Essentially, this means that a workflow could fail halfway or three-quarters of the way through, and you could tell vCO to resume that workflow, perhaps after fixing whatever issue caused it to fail in the first place, rather than starting a fresh instance of the workflow.

 

Introduction


As noted in the intro snippet, vCO now has the ability to let you resume a failed workflow. See the following vCO documentation page (vCO Documentation on Resuming a Failed Workflow) to learn more about this feature and get it set up. (I recommend doing this on a workflow-by-workflow basis only.) This feature can be quite helpful, as it automatically generates a User Interaction prompt when your workflow fails, allowing you to resume the workflow from where it left off. This could be very helpful when, for instance, your target environment lacks resources for a deployment and the workflow has already progressed through several steps of external integration (e.g., generated a helpdesk request for tracking, reserved an IP address, etc.), rather than rolling everything back and starting all over each time a workflow fails.

Failed workflow appearance when Enabled


  • When that option is enabled, rather than ending in a permanently failed state, upon failure the workflow enters a "Waiting" state for an interaction, as depicted above by the icon next to the workflow execution.
  • The Schema shows you where the workflow failed by highlighting the failed element in red.
  • The Variables tab shows the exception details in the "Exception" window at the bottom, in red text.

Using the vCO Client to Answer


The process to resume a failed workflow using the vCO Client is the same as answering a User Interaction: right-click the workflow execution, then select the "Answer" link.
The Workflow interaction window will come up, allowing you to choose to either "Resume" or "Cancel" the workflow.

If you chose Cancel and hit Submit, the workflow would cancel out and would no longer be a viable execution to resume.


However, if you chose to "Resume", the "Parameters" section gets loaded with all the input parameters for your workflow, allowing you to modify them as needed before submitting the workflow to complete from where it had failed.

Okay, great but the title said REST API...

Right, so I wanted to lay a little groundwork to make sure you understood the general flow of a failed workflow and what the UI process was before we go off to XML land for the REST API ;)

Before you continue on, be sure you have:

  1. Set the "Resume from failed behavior" to "Enabled" on your test workflow
  2. Have executed the workflow and gotten it to Fail before completing (Feel free to use the attached Test workflow at the bottom of this article.)

Retrieve the Workflow Executions list


Reminder: vCO API Documentation can be found on your vCO Server -- https://your-vco-server:8281/vco/api/docs
In order to retrieve our list of Executions, we need the following information:

  • vCO API URL format --> https://your-vco-server:8281/vco/api/workflows/<workflow-ID>/executions/<workflow-execution-id>
  • Workflow ID --> See item 1 in Screenshot above --> The workflow ID will remain the same across vCO instances. So, if you import the workflow attached to this post, your id will be the same.
  • Workflow Execution ID --> See item 2 in Screenshot above --> this is your workflow execution ID, it is unique for every run of the workflow.

Review results

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<workflow-execution 
    xmlns="http://www.vmware.com/vco" href="https://vco55.vcoteam.lab:8281/vco/api/workflows/883af9aa-7b98-4c6a-8cf5-6ec54f28c3cb/executions/ff808081458f848b01459a60144f0723/">
    <relations>
        <link href="https://vco55.vcoteam.lab:8281/vco/api/workflows/883af9aa-7b98-4c6a-8cf5-6ec54f28c3cb/executions/" rel="up"/>
        <link href="https://vco55.vcoteam.lab:8281/vco/api/workflows/883af9aa-7b98-4c6a-8cf5-6ec54f28c3cb/executions/ff808081458f848b01459a60144f0723/" rel="remove"/>
        <link href="https://vco55.vcoteam.lab:8281/vco/api/workflows/883af9aa-7b98-4c6a-8cf5-6ec54f28c3cb/executions/ff808081458f848b01459a60144f0723/logs/" rel="logs"/>
        <link href="https://vco55.vcoteam.lab:8281/vco/api/workflows/883af9aa-7b98-4c6a-8cf5-6ec54f28c3cb/executions/ff808081458f848b01459a60144f0723/state/" rel="state"/>
        <link href="https://vco55.vcoteam.lab:8281/vco/api/workflows/883af9aa-7b98-4c6a-8cf5-6ec54f28c3cb/executions/ff808081458f848b01459a60144f0723/interaction/" rel="interaction"/>
        <link href="https://vco55.vcoteam.lab:8281/vco/api/workflows/883af9aa-7b98-4c6a-8cf5-6ec54f28c3cb/executions/ff808081458f848b01459a60144f0723/state/" rel="cancel"/>
    </relations>
    <id>ff808081458f848b01459a60144f0723</id>
    <state>waiting</state>
    <input-parameters>
        <parameter type="boolean" name="isFailWorkflow" scope="local">
            <boolean>true</boolean>
        </parameter>
    </input-parameters>
    <output-parameters/>
    <start-date>2014-04-25T15:32:39.118-04:00</start-date>
    <business-state>Default System Error Handling for item: item2</business-state>
    <started-by>bazbill@VCOTEAM.LAB</started-by>
    <name>3A) Resume tester</name>
    <content-exception>Workflow failed because user decided so</content-exception>
    <current-item-display-name>Workflow Error System Handler</current-item-display-name>
</workflow-execution>

Based on the above information, the URL I need to use is: https://my-vco-server:8281/vco/api/workflows/883af9aa-7b98-4c6a-8cf5-6ec54f28c3cb/executions/ff808081458f848b01459a60144f0723/ (Be sure to adjust YOUR request to reflect YOUR workflow ID and Execution ID).
Upon submitting a GET request to that URL, the XML above is returned.

We can see in this execution that there is an "interaction" link (see Line 9), the state is waiting (see Line 13), and a "content-exception" tag is present (see Line 24). The presence of all three in a workflow execution indicates that the workflow has failed, the Resume feature is enabled, and it is waiting for a user interaction.

We now have the link to the interaction, so we can learn more about it by performing a GET on that URL...

GET the interaction

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<user-interaction 
    xmlns="http://www.vmware.com/vco" href="https://vco55.vcoteam.lab:8281/vco/api/workflows/883af9aa-7b98-4c6a-8cf5-6ec54f28c3cb/executions/ff808081458f848b01459a60144f0723/interaction/">
    <relations>
        <link href="https://vco55.vcoteam.lab:8281/vco/api/workflows/883af9aa-7b98-4c6a-8cf5-6ec54f28c3cb/executions/ff808081458f848b01459a60144f0723/" rel="up"/>
        <link href="https://vco55.vcoteam.lab:8281/vco/api/workflows/883af9aa-7b98-4c6a-8cf5-6ec54f28c3cb/executions/ff808081458f848b01459a60144f0723/interaction/presentation/" rel="down"/>
        <link href="https://vco55.vcoteam.lab:8281/vco/api/workflows/883af9aa-7b98-4c6a-8cf5-6ec54f28c3cb/executions/ff808081458f848b01459a60144f0723/interaction/" rel="add"/>
    </relations>
    <input-parameters>
        <parameter type="boolean" name="isFailWorkflow"/>
        <parameter type="Date" name="resume.fail.timeout.date"/>
        <parameter type="string" name="__System_Action"/>
    </input-parameters>
    <name>3A) Resume tester : Workflow Error System Handler</name>
    <state>waiting</state>
</user-interaction>

The interaction XML provides the inputs that are needed for the interaction, but doesn't provide any decorators, default values, etc. (see Lines 10-12).

If you wish to see the current values and other presentation options, you can drill down to the /interaction/presentation URL (see Line 6).

Presentation XML

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<presentation 
    xmlns="http://www.vmware.com/vco" id="883af9aa-7b98-4c6a-8cf5-6ec54f28c3cb:ff808081458f848b01459a60144f0723" name="3A) Resume tester : Workflow Error System Handler" href="https://vco55.vcoteam.lab:8281/vco/api/workflows/883af9aa-7b98-4c6a-8cf5-6ec54f28c3cb/executions/ff808081458f848b01459a60144f0723/interaction/presentation/">
    <relations>
        <link href="https://vco55.vcoteam.lab:8281/vco/api/workflows/883af9aa-7b98-4c6a-8cf5-6ec54f28c3cb/executions/ff808081458f848b01459a60144f0723/interaction/" rel="up"/>
        <link href="https://vco55.vcoteam.lab:8281/vco/api/workflows/883af9aa-7b98-4c6a-8cf5-6ec54f28c3cb/executions/ff808081458f848b01459a60144f0723/interaction/presentation/instances/" rel="down"/>
        <link href="https://vco55.vcoteam.lab:8281/vco/api/workflows/883af9aa-7b98-4c6a-8cf5-6ec54f28c3cb/executions/ff808081458f848b01459a60144f0723/interaction/presentation/instances/" rel="add"/>
    </relations>
    <steps>
        <step hidden="false">
            <display-name>Error in workflow</display-name>
            <description>Workflow execution has stopped on error</description>
            <messages/>
            <group hidden="false">
                <messages/>
                <fields>
                    <field type="string" id="__System_Action" hidden="false">
                        <display-name>Choose action to continue</display-name>
                        <description>Choose action to continue</description>
                        <messages/>
                        <constraints/>
                        <decorators>
                            <refresh-on-change/>
                            <drop-down>
                                <array>
                                    <string>Cancel</string>
                                    <string>Resume</string>
                                </array>
                            </drop-down>
                        </decorators>
                        <fields/>
                    </field>
                </fields>
            </group>
        </step>
        <step hidden="false">
            <display-name>Parameters</display-name>
            <description>Modify the parameters for resume</description>
            <messages/>
            <group hidden="false">
                <messages/>
                <fields>
                    <field type="boolean" id="isFailWorkflow" hidden="false">
                        <display-name>isFailWorkflow</display-name>
                        <description>isFailWorkflow</description>
                        <messages/>
                        <constraints/>
                        <decorators/>
                        <fields/>
                        <boolean>true</boolean>
                    </field>
                    <field type="Date" id="resume.fail.timeout.date" hidden="true">
                        <display-name>resume.fail.timeout.date</display-name>
                        <description>resume.fail.timeout.date</description>
                        <messages/>
                        <constraints/>
                        <decorators/>
                        <fields/>
                        <date>2014-04-26T15:32:40-04:00</date>
                    </field>
                </fields>
            </group>
        </step>
    </steps>
    <input-parameters>
        <parameter description="Choose action to continue" type="string" name="__System_Action"/>
        <parameter description="isFailWorkflow" type="boolean" name="isFailWorkflow"/>
        <parameter description="resume.fail.timeout.date" type="Date" name="resume.fail.timeout.date"/>
    </input-parameters>
    <output-parameters>
        <parameter type="string" name="__System_Action"/>
        <parameter type="Date" name="resume.fail.timeout.date"/>
        <parameter type="boolean" name="isFailWorkflow"/>
    </output-parameters>
</presentation>

In the code above, you can see the extra info about the running workflow in the Parameters step of the Steps section.

Prepare POST BODY

As noted earlier, resuming a failed workflow is similar to answering a User Interaction. The vCO Develop Web Services documentation (http://pubs.vmware.com/vsphere-55/index.jsp?topic=%2Fcom.vmware.vsphere.vco_develop_web_services.doc%2FGUID-AF6DEC91-FABD-4ABA-8D77-90CA1CCB79AF.html) provides details on this.

After a little testing, I found that the necessary parameters for my workflow to resume were "isFailWorkflow" (this was my input parameter for the workflow; if your workflow has additional inputs, you should populate them as well) and the "__System_Action" parameter. The "__System_Action" parameter is what we saw at the beginning of the article: the drop-down with "Resume" and "Cancel". The third available parameter (resume.fail.timeout.date) is not needed when submitting the body to resume or cancel the workflow.

Here's the body required to answer the attached test workflow and get it to resume using the Resume Failed Workflow feature; POST it to the execution's interaction URL (the link with rel="add" in the interaction XML above):

<execution-context xmlns="http://www.vmware.com/vco">
   <parameters>
     <parameter name="__System_Action" type="string">
       <string>Resume</string>
     </parameter>
     <parameter name="isFailWorkflow" type="boolean">
       <boolean>false</boolean>
     </parameter>
   </parameters>
</execution-context>

Summary/Resources

This article provided a quick intro to a cool vCO feature and a light walk-through of not only using the vCO client to take advantage of the feature, but also the necessary steps to use it over the REST API :)

Sample Workflow used during article:

Extend vCAC 6 IaaS Lifecycle with vCO introduction video

Last year I created an extensibility package to simplify and automate the steps necessary to extend the vCAC IaaS lifecycle process with callouts to vCO workflows. While this was mainly aimed at accelerating Proofs of Concept, it has since been broadly adopted in production environments. Since vCAC 6.0 it is part of the product, and it is available for vCAC 5.2 as a separate download.
Since this is now a very hot topic, I am including here a very useful extensibility introduction video that was released by my colleagues from tech marketing almost three months ago.



http://www.youtube.com/watch?v=yhnKYZzNRQs

VMware Horizon vCenter Orchestrator Plug-in Released

Hello all you end user compute admins! If you've been waiting for a vCO plug-in for your Horizon environment, your wait is over. Join Aaron Black in this "Getting Started with vCenter Orchestrator Horizon Plug In" video.
http://www.youtube.com/watch?v=xDKUE71ltfs
In order to get the plug-in, you must be licensed for Horizon 6.0 Enterprise. For a quick link to the download (login required at VMware's site), click Links -> Plug Ins, then scroll down to the "Horizon" plugin link :)


Extend Automation Center with F5 AFM using vCO + Dynamic Types

My colleague Chris Slater at Defined by Software has published an article on how to extend VMware vCloud Automation Center (vCAC) with F5 firewall functionality. I usually do not cross-post articles, but this one is worth mentioning since it is a real-world example of how to leverage vCAC + vCO + Dynamic Types + a third-party API.


The first part of the article explains the different options available to automate F5, including the F5 plug-in for vCO, PowerShell, the BIG-IP CLI, and the SOAP and REST APIs, and gives an overview of the solution.
The second part of the article digs into how to build this integration leveraging my Dynamic Types plug-in generator package and tutorial.

Thanks to Chris for sharing this, and thanks also to Marc Chisinevski from F5, who built the AFM integration.






Dynamic Input Values Based on other Inputs


A frequent requirement when performing orchestration tasks is to have interdependent input fields. For example, if I enter XYZ into the first input, I want the second input to be relevant to XYZ. In the past, I have frequently built workflows where you select a Datacenter for the first input, and the second input then presents a list of Datastores (or VMs or Clusters or Hosts). In this tutorial, I'm going to do something a little different: the first input of this demo workflow will be a string, and the second will be an AD:User object chosen from a list of objects found in Active Directory based on the first input.
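
As a rough sketch, the action bound to the second input could look like this (assumptions: the Active Directory plug-in's ActiveDirectory.search() scripting call, whose exact signature you should verify in your API explorer, and a hypothetical action name):

// Action findAdUsers (hypothetical name) -- returns Array/AD:User
// Input: name (string) -- the value typed into the first input field
if (name == null || name.length < 3) {
    return []; // avoid searching the whole directory on very short strings
}
return ActiveDirectory.search("User", name); // AD:User objects whose name matches the input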

 

Scenario Overview

This article will address the following requirements:
Provide an interface for admins to log in via a web browser to search for a newly created Active Directory user and add it to one or more groups.

The front-end for this will be vCloud Automation Center (vCAC) 6.1. The Advanced Service Designer (XaaS) will be used to call a custom vCenter Orchestrator (vCO) workflow that actually does the work. Results of the workflow should be visible in vCAC.

This tutorial will touch on the following topics:

  • Link input fields so that the values presented on one depend on another
  • Present vCO Workflow output via vCAC's Requests
  • Parameter bindings in vCO
  • vCO Presentation properties
  • vCO Action creation
  • vCO Workflow creation
  • Using the vCAC Advanced Service Designer (ASD) to add a vCO Workflow as a Catalog Item
  • Publishing a Catalog Item and tying it to a Service

Note: Throughout this article, I use vCO and vCAC as unofficial abbreviations for VMware's vCenter Orchestrator and vCloud Automation Center respectively. In upcoming versions, these will be vRealize Orchestrator (vRO) and vRealize Automation (vRA).

Environment Requirements

  • vCAC should be set up with the default or an external vCO Server configured (See Administration -> Advanced Services -> Server Configuration)
  • A service should be configured and entitled in vCAC to place the custom request in (This article will use a Service named "Demos")
  • An account capable of managing Advanced Services in vCAC and requesting services (this may be the same account or separate accounts: admin acct and user acct)
  • The connected vCO Server should have the Active Directory plug-in configured with a SharedSession account in AD with sufficient permissions to add users to groups
  • An account in the vCO Admins group capable of creating new vCO Workflows

Create vCO Workflow Schema

create_vco_workflow_schema.png

This will be the workflow we call from vCAC's Advanced Service Designer later, so be sure everything is set as described:

  1. Create a new workflow called Find and Add User to Group
  2. Drag the "Add a user to a userGroup" library workflow, an exception element, and two Scriptable tasks onto the schema as shown in the diagram above
  3. Create 3 Inputs for this workflow: name (String), user (AD:User), and group (AD:UserGroup)
  4. Create 1 Output for the workflow: outputText (String)
  5. Confirm (create if it does not exist) that an attribute named "errorCode" was created and is bound to the "Exception" tab of the "Add a user to a user group" workflow
  6. Rename each of the Scriptable task elements to "Set Output"

Next, review and/or set Input/Attribute/Output bindings.

Bindings: Add a user to a user group

bindings_add_a_user_to_a_user_group.png

  1. Select the Add a user to user group workflow
  2. In the bottom pane (or click the edit icon to bring up the pop-up window for the details), click on the Visual Binding tab and confirm or set bindings as shown above
  3. Make sure that group is bound to the In Parameters group and that user is bound to the In Parameters user as shown above

NOTE: If you prefer the behavior shown here where the detail pane is below the selected item rather than clicking the edit icon and having a pop-up window, you can change this on your vCO client:
1) If in edit mode, Save and Close.
2) Click Tools -> User Preferences -> Workflows
3) Remove the check from "Edit workflow items in a pop-up window"
With this preference set, you can always pop out the details pane by clicking the small box in the top right corner (highlighted with a BLUE Arrow in the screenshot above).

Bindings: Normal path "Set Output" Scriptable task

bindings_normal_path_set_output_scriptable_task.png

Bind the user and group inputs to the IN and outputText to the OUT for this element as shown in the screenshot above.

Paste the following into the Scripting tab of the element:

var outputText = "Added " + user.distinguishedName + " to group " + group.distinguishedName + ". ";

Bindings: Exception path "Set Output" Scriptable task

bindings_exception_path_set_output_scriptable_task.png

Bind the user and group inputs to the IN and outputText to the OUT for this element. Also, bind the errorCode attribute to the IN tab of this element as shown in the screenshot above.

Paste the following into the Scripting tab of the element:

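// errorCode is bound in from the Exception of the failed "Add a user to a user group" sub-workflow.
// System.warn logs to this workflow's run log; Server.warn also records the message in the vCO server logs.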
var outputText = "Error encountered when attempting to add " + user.distinguishedName + " to group " + group.distinguishedName + ". "+errorCode;
System.warn(outputText);
Server.warn(outputText);

Click Save and Close.

vCO Workflow Current State and next Steps

vco_workflow_current_state_and_next_steps.png

At this point, the workflow should be capable of prompting for all three inputs individually and processing the request. However, the name input has no effect on the user input at this stage. That can be set up in the Presentation tab of our workflow, but in order to do that, we'll need an action that:

  • Takes a string as an input
  • Returns an Array of AD:User objects or an Empty Array if no users are found to match the input string

Let's create that action now!

Create a findUserByName Action

create_a_finduserbyname_action.png

If you don't already have a module (Action Folder) to place your new action in, create one now. As shown above, I have created one named:
com.vmware.coe.library.ad.user <- Note the syntax: vmware.com reversed, followed by my dept. "coe", then general and specific categories: library.ad.user

Once you have the action created, set the Return type to Array/AD:User
Add 1 string input and set the name to name

Next, add the following to the scripting box down below:
if (!name) return []; // IMPORTANT: if the "name" input has not been provided, return an empty array. Returning a null will cause presentation issues and excluding the line will generate invalid calls to the AD Plug-in.

var users = ActiveDirectory.searchRecursively("User",name); // Do a recursive search in Active Directory for User objects with the "name" provided
// If users is not null, perform some System debug
if (users != null){
    System.debug("Found "+users.length+" matching users for: "+name);
    for each (usr in users){
        System.debug(usr);
    }
}
return users; // Finally, don't forget this line!!! you must return your array of users

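Before wiring this action into the workflow Presentation, you can sanity-check it from any scriptable task. Here is a minimal sketch, assuming the module path and action name used above (adjust both to your own environment):

// Quick manual test of the action; "developer" is just a sample search string
var users = System.getModule("com.vmware.coe.library.ad.user").findUserByName("developer");
System.log("Matching users found: " + (users ? users.length : 0));
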
vCO Workflow Presentation

vco_workflow_presentation.png

Return to the "Find and add user to group" workflow created earlier and enter edit mode.

  1. Click on the Presentation tab
  2. Select the user input
  3. In the bottom pane, click the Properties tab
  4. Click the little blue triangle icon to add a new property. When the window pops up, select the Predefined list of elements and click OK
  5. Click the small puzzle icon to the far right of the window to choose an action that will provide the array of choices. The Search action with result window will pop up and present a list of actions that return an array of objects of the same type as the input field we have selected - in this case, Array/AD:User. Select the findUserByName action.
  6. When an action is selected, each of the inputs to the action are presented in the bottom window. To allow us to specify an existing Input Parameter or Attribute, click the drop-down and select the Double-Arrow icon as shown above.
  7. Use the Pencil icon to select an Input or Attribute to be used as the action input. This will present the Linked parameter of type string window
  8. Select the name input
  9. Click Accept

To wrap up our Presentation settings, add the Mandatory input property to all three inputs and set each to Yes

Save and Test vCO Workflow

save_and_test_vco_workflow.png

At this point, it is important to Save and Close the workflow. Be sure to set your initial Version number and Description for the workflow.

Now, Test the workflow in the vCO Client by running it and specifying the first name or last name of an account you know is in Active Directory.

Once you have entered the name (yes, it will lag a little - this is a small bug with the vCO client and will not be present when the workflow is presented to you in vCAC), click on the Not set link for the Active Directory User Account input. Confirm that you are presented with one or more accounts that have the "name" you specified as part of their accounts. In my case, I entered "developer" and there are 4 Developer accounts to choose from.

Choose one of the accounts, and click Select.

Choose an Active Directory group to add the user to and click Submit.

Confirm that the user gets added to your AD Group.

Try running the workflow again to add the same user to the same group - what happens?

In both results above, review the value of the outputText Output Parameter to confirm you have expected values.

This workflow should now be ready to incorporate into vCloud Automation Center - let's do that now!

vCloud Automation Center - Create a new Service Blueprint

vcloud_automation_center_-_create_a_new_service_blueprint.png

Login to vCAC with your administrative account.
Click the Advanced Services tab, then select the Service Blueprints entry on the left
Click the Green + Add button to add a new blueprint
Select your Find and Add User to Group workflow that you created in vCenter Orchestrator and click Next
On the next page, place a check in the Hide catalog request information page box and click Next

vCloud Automation Center - Add Form

vcloud_automation_center_-_add_form.png

Add a new Form to the Blueprint so that the output text may be available to the user.

  1. Click the Add Form button (Green +)
  2. Specify a name such as Output Text
  3. Select the Submitted request details entry for the Screen Type
  4. Click Submit

Form Page Header - Output Text

form_page_header_-_output_text.png

  1. Click the New Page (Green +) button to add a new Page to the form
  2. Set the Heading to Output Text
  3. Click Submit

Output Text Page

output_text_page.png

  1. In the left pane, scroll down until you see the Outputs section. This area will have each of your vCO workflow outputs. In our case, there is only 1.
  2. Drag the outputText field onto the form page, just under the "Form page" dropdown box as shown above
  3. Click the Edit icon (pencil)
  4. Update the Label to be more user friendly: Output Text
  5. Set Size to Large
  6. Click Submit
  7. Click Next
  8. On the Provisioned Resource page, there is nothing to change so click Add
media_1416431710920.png

  1. When you return to the Service blueprints page, select the row containing your new workflow
  2. Click Publish

Your new vCO workflow is now published as a Catalog Item. But, in order to allow anyone to see it, we need to have it entitled. In this environment I have a "Demos" Service that already has entitlements setup so all I need to do is specify the "Demos" Service on my Catalog Item. Let's do that now.

media_1416432038100.png

Click Administration -> Catalog Management -> Catalog Items
Select the row containing the Find and Add User to Group item
Click Configure

media_1416432108771.png

On the Configure Catalog Item page:

  1. Browse for and add a custom icon if desired
  2. Confirm that Status is Active
  3. Click the Service dropdown and select the appropriate Service to tie the Catalog Item to. In my case, this is the Demos service
  4. Click the Entitlements tab to confirm that Entitlements exist for the selected service
  5. Click Update when done

You are now ready to test the request via the Catalog tab!

vCAC - Catalog - Demos

vcac_-_catalog_-_demos.png

The integration should be ready to test. At this point, if necessary, log out of the admin account then login using an account that is entitled to the Service specified in the previous step. My admin account is entitled, so I will continue on with the same account.

  1. Click on the Catalog tab
  2. Select the Service (in my case Demos) from the list on the left to narrow the available requestable items (if your account is only entitled to 1 service, your screen will not have the services listed down the side)
  3. Click the Request button

New Request

new_request.png

  • Enter a valid given or surname in the name input then tab or click into the user dropdown
  • After a brief moment (vCO Action running in the background), the dropdown should populate with valid AD:User accounts
  • Select an account, then select an AD Group and Submit the request
  • Click OK on the confirmation page

Review Requests

review_requests.png

Click on the Requests tab
Locate and click on the Request Number that matches the request you just submitted. In the screenshot above, I clicked on the 23 at the far left of the table

Request Details

request_details.png

Review the General, Step, and Output Text tabs.

Next Steps

See if you can customize this process to accept multiple groups to add the user to. This tutorial covered a number of good, reusable techniques. I hope you enjoyed it. Please share if you found it worthwhile!

SQL Plug-in + DynamicTypes = Simple CMDB for vCAC - Part 1


This multi-part series will step you through the process of mapping a SQL table into vRealize Orchestrator, building out a DynamicTypes plug-in inventory based on that table (using my workflow package), then exposing it to vRealize Automation's Advanced Service Designer (ASD).

Introduction

vRealize Automation (vRA) features an Advanced Service Designer (ASD) that allows you to offer nearly anything as a service (XaaS). In order to take advantage of that feature, it requires a vRealize Orchestrator (vRO) Inventory object. This means you must have a plug-in that provides such an inventory. In the past, this meant Java skills were needed to build out a plug-in. Fortunately, this is no longer the case with the Dynamic Types plug-in. We touched on this plug-in in the past with regards to using the HTTP-REST plug-in. This article will take a different approach in that we will use the SQL Plug-in to provide our back-end service - a mini CMDB consisting of server names and IPs.

 

This first article will cover the following topics:

  • Microsoft SQL Server Database Table for use in tutorial
  • How to determine JDBC URL for use with the SQL Plug-in (or jdbc plug-in)
  • Adding the Database table to vRO

Future articles will build upon the work performed here. They will cover:

  • How to create a Dynamic Types Plug-in Inventory using my workflow package
  • How to add the simple CMDB to vRAs Advanced Service Designer
  • How to add entries to the CMDB via vRA Infrastructure as a Service (IaaS) workflow

The Database

the_database.png

For this tutorial series, we will keep things very simple. Our CMDB will consist of a single "Assets" table. The table will reside in a database named "CMDB" on a SQL Server 2008 R2 server (vcac-iaas-01a.corp.local).

The table will have only 5 fields:

  1. ServerName - Hostname of the server
  2. ServerIP - IP Address of the server
  3. ServerDNS - Domain that the server resides in
  4. ServerID - Unique ID of the server, for simplicity, this article will simply use the same value as the ServerName
  5. ServerOwner - e-mail address of the person who provisioned the system

Before proceeding, make sure you have a simple table created and available for you to connect to. Also be sure to note whether or not your database is listening on its standard port. In my case, SQL Server IS listening on port 1433.

Database Plug-ins and JDBC URL

database_plug-ins_and_jdbc_url.png

vRO features two plug-ins capable of communicating with external databases: SQL Plug-in and JDBC Plug-in. Regardless of which plug-in you decide to use for your solution, you will require a JDBC URL to connect to a database. The syntax for the url can be tricky, but vRO ships with a handy workflow to help you figure that out: JDBC URL generator

Run that workflow now.

JDBC URL generator - Screen 1

jdbc_url_generator_-_screen_1.png

In my lab environment, the vcac-iaas-01a server is running SQL Server 2008 R2 and is a member of the corp.local domain. Additionally, the default instance is being used and Windows authentication is enabled.

  1. Use the drop-down to select your database. Based on my note above, I will select the SQL Server/MSDE entry
  2. Specify either the FQDN or IP Address of the Database Server
  3. Provide the name of the database you wish to connect to
  4. (OPTIONAL) Provide the database port - if left blank, the default port will be used as specified by the description above the inputs based on the type of database chosen
  5. Provide the username of an account with permission to add/remove content to the database
  6. Provide the password for the account

JDBC URL generator - Screen 2

jdbc_url_generator_-_screen_2.png

If the Microsoft database (SQL Server/MSDE) was chosen on the first screen, then an additional screen will be presented so that you may specify an optional instance name and domain name.

  1. If using the default instance of SQL Server, it should be safe to leave this field blank. Otherwise, specify the appropriate Instance Name that holds your database.
  2. If your database server is a member of a domain and is using Windows-based authentication, enter the Windows Domain here as shown in the screenshot above. If you are using a standalone SQL Server with Windows authentication (local accounts), then specify the server's HostName in this field.

Click Submit

Upon submission, vRO will construct the JDBC URL then attempt a connection to the database using that URL and the credentials provided.

JDBC URL generator - Results

jdbc_url_generator_-_results.png

Review the Logs tab or the Variables tab to obtain your JDBC URL (Connection string).

If the workflow fails, you may need to try different variations of your inputs. Most commonly, I have seen issues around not having the correct port specified / DB listening on dynamic port instead of default, specifying invalid credentials, specifying an invalid database instance, or missing/incorrect domain name.

When successful, the Logs tab should contain text like this:
[2014-12-02 10:13:57.831] [I] Connection String: jdbc:jtds:sqlserver://vcac-iaas-01a.corp.local:1433/CMDB;domain=corp
[2014-12-02 10:13:58.170] [I] Connection to database successful

You should now have the necessary connection string for use in the SQL and JDBC workflows to connect to a database!
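
For reference, the SQL Server URLs produced by this workflow (via the bundled jTDS driver) follow this general shape; the placeholder names below are illustrative:

jdbc:jtds:sqlserver://<host>:<port>/<database>;domain=<windows-domain>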

Add SQL Database to vCO

add_sql_database_to_vco.png

Now that we have the JDBC URL, we can run the Add a database workflow (found under Workflows\Library\SQL\Configuration)

For the Name input, this is the display name shown in the vRO Inventory, so set it to something friendly that makes sense. In my case, I will simply call it CMDB - I could just as well name it "Server Inventory".

Choose the Database Type, then paste in your connection URL from the workflow you ran in the last step.

On the next screen, I suggest using Shared Session, then providing the same credentials as you used in the JDBC URL generator workflow.

Upon submission, your Logs tab should display something similar to this:
[2014-12-02 10:57:19.071] [I] Database object added: DynamicWrapper (Instance) : [SQLDatabase]-[class com.vmware.o11n.plugin.database.Database] -- VALUE : Database[name: CMDB, type: MS SQL, connectionURL: jdbc:jtds:sqlserver://vcac-iaas-01a.corp.local:1433/CMDB;domain=corp, sessionMode: Shared Session, username: administrator]

Verify SQL Inventory

verify_sql_inventory.png

View the Inventory of your vRO Server and expand the SQL plug-in to confirm that the "CMDB" entry is there and you can see the tables in the database.
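
You can also confirm this from scripting. Here is a minimal sketch using a scriptable task with an IN parameter named sqlDb of type SQL:Database; it assumes the SQL plug-in's getTables() method and a name property on the returned table objects (adjust if your plug-in version exposes these differently):

// List the tables visible on the mapped database
var tables = sqlDb.getTables();
for each (var t in tables) {
    System.log("Found table: " + t.name);
}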

Create Folder for CMDB Workflows

create_folder_for_cmdb_workflows.png

Return to the workflows tab in the vRO client and create a new root folder named "CMDB Workflows"
We will use this folder in the next step...

Generate Workflows to manage the table

generate_workflows_to_manage_the_table.png

We are now ready to generate some simple CRUD (Create Read Update Delete) workflows for our Assets table.

  • Run the Generate CRUD workflows for a table workflow
  • Select the Assets table and then specify the CMDB Workflows folder (Workflow Category) as the destination
  • Click Submit

Verify Workflows are Generated

verify_workflows_are_generated.png

Confirm that the four workflows have been created in the folder specified...

NOTE: It seems that in version 5.5.2.1 of vRO, there is a small bug (I have just reported this to Engineering) that does not bind the "databaseResource" attribute to the ResourceElement object properly. This "Not found" value will prevent the workflows from validating or running properly. If this happens for you, it is pretty simple to fix. Just EDIT each of the four generated workflows, and DELETE the "databaseResource" attribute from the General tab AND remove it from the "IN" binding of the Scriptable task.

Wrapping up...

This concludes the first article in the series. Stay tuned for additional articles that will build upon the database table we have mapped here in this article.

SQL Plug-in + DynamicTypes = Simple CMDB for vCAC - Part 2


Welcome back! This is the second article of a multi-part series that steps you through the process of mapping a SQL table into vRealize Orchestrator, building out a DynamicTypes plug-in inventory based on that table, then exposing it to vRealize Automation's Advanced Service Designer (ASD). In the first article, we got our database table mapped using the SQL Plug-in and generated some CRUD workflows.

Introduction

Let's build a simple Dynamic Types plug-in around our SQL Table that we created in our previous article.

NOTE: For an overview of the Dynamic Types plug-in, review this great article by Christophe!

This second article will cover the following topics:

  • How to define a Dynamic Types namespace and type backed by the SQL Plug-in database (using my workflow package)
  • How to build "Create a Record" and "Delete a Record" workflows around the CRUD workflows generated in Part 1

Future articles will build upon the work performed here. They will cover:

  • How to add the simple CMDB to vRealize Automation's Advanced Service Designer
  • How to add/remove entries to the CMDB via vRealize Automation Infrastructure as a Service (IaaS) workflow

Import Package

Building out this solution by hand would take hours of testing and digging through code to figure things out. Instead of stepping you through that process, I have uploaded a package to the VMware Communities for use with this article. Download that package and import it to your vRO server now. https://communities.vmware.com/docs/DOC-28573

media_1417816006050.png

  1. Once you have imported the package, run the "1 - Define Plug-in Namespace and DB" workflow.
  2. Provide a Namespace for your plug-in - this will be reflected in your object types once they have been defined. I will use MyCMDB here.
  3. Specify the SQL Plug-in Database you are creating the plug-in for
  4. Click Select to choose the database
  5. Click Submit when done

We need a root element in our Inventory to hold the various objects. This root element is our Namespace. The Namespace is also used in the construction of the Object Type name as follows:
DynamicTypes:MyCMDB.Asset

DynamicTypes = The Plug-in that has the object
MyCMDB.Asset => This is the Object Type (will be creating that soon...)
To break this down further:
MyCMDB = Namespace
Asset = Type

The Database we specified during namespace creation was added as a Custom Property of the Namespace object so that it may be easily referenced in later workflows and actions.

Review vCO Inventory 1/2

review_vco_inventory_12.png

  • Click on the Inventory tab, then expand out Dynamic Types to see that you now have a Namespace listed: MyCMDB
  • The right pane gives details on the selected Namespace

Review vCO Inventory 2/2

review_vco_inventory_22.png

Click on the Custom Properties tab in the right pane
Note that the "CMDB" SQL:Database object has been stored in the "sqlDb" key as a custom property
This allows for easy retrieval in workflows using the following syntax:

// Get Namespace object by name:
var ns = DynamicTypesManager.getNamespace("MyCMDB");
var sqlDb = System.getModule("com.vmware.library.customProperties").getCustomProperty(ns,"sqlDb");
// at this point, sqlDb is now a SQL:Database object that we can use for further scripting, queries, etc...
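
As a quick follow-up to the snippet above, the retrieved database object can be queried directly. A minimal sketch, assuming the Assets table from Part 1 and the SQL plug-in's readCustomQuery method:

// Continuing from the snippet above: query the Assets table through the stored sqlDb object
var rows = sqlDb.readCustomQuery("SELECT ServerName, ServerIP FROM Assets");
System.log("CMDB currently tracks " + rows.length + " asset(s)");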

Define Type(s)

define_type_s_.png

Now, we must define one or more Types. In my case, I want my type to be Asset so that's what I'll specify when I run the workflow...

Run the "2 - Define a Type and its parent folder" workflow

  • Select your Namespace that was created in the previous step (Once selected, the "Parent Folder name" input will populate a dropdown list with the names of all the tables in the associated DB)
  • Select a Parent folder name (Database table - this will be the name of the root inventory folder that holds the table records. Once a selection has been made here, the table fields will be loaded into dropdown lists for the ID and Name inputs)
  • Specify a Type Name (This will be the name of the object/table record entry you wish to be able to request.)
  • ID Field: Each Dynamic Type object requires an ID; choose one of the table fields to be used as the ID.
  • Name Field: Each Dynamic Type object requires a Name; choose one of the table fields to be used as the Name.
  • Choose an icon (Resource Element image file) to represent your Object Type in the inventory (this is the image to be used as an inventory icon. You may add more images by uploading images to vRO Resource elements)
  • Click Submit

NOTE: The ID and Name fields that get mapped here MUST be mapped to table fields that will never be null.

Check Inventory Custom Properties

check_inventory_custom_properties.png

The workflow that was run in the last step bound the selected name and id field names as custom properties on the Dynamic Type for the "Asset" I specified. These fields are from the DB table we chose - and we can also see that the table name was bound. In my case, this is "Assets". Binding these values as custom properties allows for easy retrieval by the findAll and findById actions that help make the DynamicTypes work!

Check Inventory

check_inventory.png

Now, expand out your Namespace and observe that each record in your DB Table has an inventory object showing under the folder. As you can see in the screenshot above, my table currently has two entries: controlCenter and vco.

Create a Record

create_a_record.png

Remember those CRUD workflows we created in Part 1? It's time to use those now!

  1. Create a new Workflow named something like "Create a Record"
  2. Drag the "Create active record for 'Assets'" CRUD workflow onto the schema
  3. Drag the "getDbObject" and "getParentObject" actions from the "com.vmware.coe.dynamicTypes.sql" Action module onto the schema AFTER the Create workflow - be sure to place them as shown in the screenshot above
  4. Drag the "invalidateObject" action from the "com.vmware.coe.dynamicTypes" action module onto the schema after the other actions
  5. Add a Scriptable task to the end of the workflow and name it "Map Output"
  6. Click back on the "Create active record ..." workflow
  7. Now click the Setup button that appears in the top right of your schema

Create a Record - Promote Workflow Input/Output Parameters

create_a_record_-_promote_workflow_inputoutput_parameters.png

The exact steps you take here will be up to you... however, to keep things simple, I have set the "isUnique" Mapping Type to "Value" with a Value of "Yes" - this will create an attribute named isUnique with a value of Yes and bind it to the sub-workflow input for me.

The rest of the Mapping Types were left as Input because I will want these provided by the user/vRealize Automation in a future article.

For the Output Parameter, rename the variable to something other than "result" - I chose "newRecord" and set the Mapping Type as "Local Variable" (Attribute).

When done, click the Promote button.

Note: If you set any of the Mapping Types to "Skip" or "Value", you will need to provide a value by editing the sub-workflow's IN tab or, if you chose "Value", the corresponding attribute on the Create a Record workflow's General tab.

Create a Record - Presentation

create_a_record_-_presentation.png

Now that the inputs have been defined for this workflow, it is important to jump over to the Presentation tab and set your ID and Name fields as Mandatory inputs. Failure to include either of these in a record added to the database will result in a broken Inventory!

Once those are marked as Mandatory, return to the Schema tab.

Create a Record - Review sub-workflow Visual Binding

create_a_record_-_review_sub-workflow_visual_binding.png

At this point, all IN parameters for the sub workflow should be bound to either IN Parameters or In Attributes and the OUT Parameter should be bound to an Out Attribute as shown in the screenshot above.

Create a Record - getDbObject

create_a_record_-_getdbobject.png

This getDbObject action will retrieve the DynamicTypes object based on the ID of the newly created database record. The object is needed for some additional actions, and then so that it can be mapped to an output and returned to the calling system/user.

  1. Click on the getDbObject action (Note: Don't worry about the type showing DynamicTypes in the screenshot, that will get fixed during workflow validation)
  2. Click on the Visual Binding tab
  3. Drag the "objectId" from getDbObject to the field you specified as being the record id field. In my case, this is the "ServerID" Input
  4. Drag the "type" from getDbObject to the "In Attributes" box in the lower left and release - a "Create parameter" window will pop up - Specify name as getDbObjectType with a value of MyCMDB.Asset (please see note below), click OK
  5. Drag the "actionResult" from getDbObject to the "Out Attributes" box in the lower right and release - a "Create parameter" window will pop up - Specify name as newAsset, click OK

Note:
The format of the "type" field for this action is: namespace.type
Based on the choices I made earlier in this article, this means I need to specify "MyCMDB.Asset" as the "type" to pass into this action. If you made different choices, be sure to specify based on YOUR Namespace and YOUR type.

Create a Record - getParentObject

create_a_record_-_getparentobject.png

Once we have the newly created object, we need to get the parent object (folder) so that the DynamicTypes plug-in can be notified that the contents of the folder have changed.

  1. Click on the getParentObject action
  2. Click on the Visual Binding tab
  3. Drag "object" from getParentObject to the "newAsset" In Attribute in the lower left corner. (The small triangle will turn green when your mouse is over newAsset to indicate it is ready for binding)
  4. Drag "actionResult" from getParentObject to the "Out Attributes" box in the lower right and release in an empty area so that the Create Parameter window pops up. Name the parameter "parentFolder"

Create a Record - invalidateObject

create_a_record_-_invalidateobject.png

The invalidateObject action takes a DynamicTypes object as an input and notifies the DynamicTypes plug-in that it is no longer valid and should be refreshed. For our purpose, this will result in the folder refresh discovering it has a new child object.

  1. Click on the invalidateObject action
  2. Click the Visual Binding tab
  3. Drag "object" from InvalidateObject to the "parentFolder" In Attribute in the lower left of the screen. Make sure to release on that attribute when the triangle is green to bind them as shown in the screenshot above

Create a Record - Map Output - IN

create_a_record_-_map_output_-_in.png

Now that we've created and retrieved our object and notified the plug-in that the parent folder needs to be refreshed, we can map our output.

  1. Click the Map Output Scriptable Task
  2. Click the In tab
  3. Click on the "Bind to workflow parameter/attribute" button to create a new input binding.
  4. Select the existing newAsset attribute that was created earlier
  5. Click Select

Create a Record - Map Output - OUT

create_a_record_-_map_output_-_out.png

  1. Now click on the OUT tab
  2. Click on the "Bind to workflow parameter/attribute" button to create a new output binding.
  3. The Output Parameter does not exist yet, so click the link to "Create parameter/attribute..."
  4. Specify name as assetOut
  5. Use the Filter box to search for the asset object type
  6. Select the DynamicTypes:MyCMDB.Asset Type
  7. Change the Create option to "Create workflow OUTPUT PARAMETER with the same name"
  8. Click OK

Workflow Validation

Click the Save button in the bottom right of the client.
During the course of dragging actions and binding, there were a few instances where the ANY object type was bound to specific object types such as DynamicTypes:MyCMDB.Asset
It is a good idea to Validate the workflow now to reset bindings to eliminate warnings and verify that we did not miss any important variable bindings.

Once your workflow has been validated, Increment your Version number on the General tab, give the workflow a meaningful description, then click Save and Close

Delete a Record

delete_a_record.png

Now that we have a workflow capable of creating new records, we should create one that can delete them. It is important that the input be of the correct object type.

  1. Create a new workflow named "Delete a Record"
  2. For the first element, add a Scriptable task and name it "Get ID"
  3. Click the IN tab for that Get ID Scriptable task
  4. Create a new parameter named "asset" of type "DynamicTypes:MyCMDB.Asset" and make sure to set it as an IN PARAMETER. Then click the OUT tab and create a new parameter named "assetId" of type string, keeping the default setting so the bound variable will be an Attribute.
  5. When the input parameter has been setup, click Select
  6. Add the getParentObject from com.vmware.coe.dynamicTypes.sql after the scriptable task
  7. Next, Add your "Delete active record ..." CRUD workflow from the first article
  8. Finally, add the invalidateObject action from the com.vmware.coe.dynamicTypes as the last element before the workflow End

Delete a Record - Get ID

delete_a_record_-_get_id.png

Click on the Scripting tab of the Get ID scriptable task and add the following code:
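// asset.id returns the value of the table field you mapped as the ID Field when defining the type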
var assetId = asset.id;

Now compare your environment with the screenshot here.

Delete a Record - getParentObject

delete_a_record_-_getparentobject.png

Select the getParentObject action, then go to the Visual Binding tab
Drag "object" from getParentObject to the In Parameters and bind it to the "asset" input that is there
Drag "actionResult" from getParentObject to the Out Attributes and release in the empty area - when prompted to create a new attribute, name it parentFolder

Delete a Record - Delete active record (sub-workflow)

delete_a_record_-_delete_active_record__sub-workflow_.png

The Delete active record CRUD workflow that was auto-generated will have a number of inputs. The most important input there is the one that you mapped as the "ID" when setting up the object type. In the case of this article, this is the "ServerID" input. Setup the inputs as shown in the screenshot above.

isUnique should be bound to an attribute named isUnique (boolean) with a value of "Yes"
ServerID should be bound to the attribute named assetId that we created when setting up the Get ID Scriptable task in this workflow
The remaining inputs may be bound to NULL (Do this by clicking on Not Set, then double-clicking NULL. All Inputs/outputs of workflow elements must be bound to either NULL or a valid value)

On the OUT tab, there is a single "result" output of type number. This identifies how many records were deleted. For simplicity in this article, I bind mine to NULL. However, as a best practice, I would actually recommend binding this to an attribute named something like recordsDeleted and then use a decision box to determine if recordsDeleted == 1. If so, continue on, if not perform some error handling since we expect a record to be deleted here.
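
A scriptable sketch of that recommended guard, assuming an attribute named recordsDeleted bound from the sub-workflow's result output (both names come from the suggestion above, not from the generated workflow):

// Fail loudly if the delete did not remove exactly one record
if (recordsDeleted != 1) {
    throw "Expected to delete exactly 1 record, but deleted: " + recordsDeleted;
}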

Delete a Record - invalidateObject

delete_a_record_-_invalidateobject.png

The invalidateObject action takes a DynamicTypes object as an input and notifies the DynamicTypes plug-in that it is no longer valid and should be refreshed. For our purpose, this will result in the folder refresh discovering that a child object has been removed.

  1. Click on the invalidateObject action
  2. Click the Visual Binding tab
  3. Drag "object" from InvalidateObject to the "parentFolder" In Attribute in the lower left of the screen. Make sure to release on that attribute when the triangle is green to bind them as shown in the screenshot above

Save, Validate, then Save and Close

Test The Workflows from vRO Client

Using the vRealize Orchestrator client, make sure that your Create a Record and Delete a Record workflows each produce the appropriate results: adding and removing records from your database table AND from your Inventory.

Wrapping up...

This concludes the second article in the series. Stay tuned for additional articles that will build upon the workflows we have created here in this article.

Get Addresses in Range

vRealize Orchestrator (vRO) is frequently used for network-related automation, which may involve working with IP addresses. From an end-user perspective, it is nice to specify a range of addresses such as 192.168.1.1-192.168.1.100 rather than having to specify all addresses. I found some simple Javascript in this Converting IP Addresses article that is easily adapted to vRO. You can use the code included in this article to either return an array of addresses in the range specified, or simplify it by returning only the total number of addresses in the range. Either way, I hope you find this code helpful in your workflows.
  1. To get started, create an action named getAddressesInRange
  2. Set the Return Type to Array/string as this will be an array containing ALL the addresses in the range
  3. Define two inputs: startIP and endIP - both as strings and give them Descriptions as "Starting IP Address" and "Ending IP Address" respectively
  4. Now, paste in the following code for the script
var startNum = dot2num(startIP);
var endNum = dot2num(endIP);

var ipList = new Array();

System.debug("Total Addresses in Range: "+((endNum + 1)-startNum));

for (var i = startNum; i != endNum + 1; i++) {
    System.debug(num2dot(i));
    ipList.push(num2dot(i));
}
return ipList;

// Function source: http://javascript.about.com/library/blipconvert.htm
function dot2num(dot) {
    var d = dot.split('.');
    return ((((((+d[0])*256)+(+d[1]))*256)+(+d[2]))*256)+(+d[3]);
}

function num2dot(num) {
    var d = num%256;
    for (var i = 3; i > 0; i--) {
        num = Math.floor(num/256);
        d = num%256 + '.' + d;
    }
    return d;
}

 Some example output:

[2015-01-08 14:57:54.182] [D] Total Addresses in Range: 15
[2015-01-08 14:57:54.183] [D] 192.168.1.1
[2015-01-08 14:57:54.185] [D] 192.168.1.2
[2015-01-08 14:57:54.187] [D] 192.168.1.3
[2015-01-08 14:57:54.188] [D] 192.168.1.4
[2015-01-08 14:57:54.190] [D] 192.168.1.5
[2015-01-08 14:57:54.192] [D] 192.168.1.6
[2015-01-08 14:57:54.193] [D] 192.168.1.7
[2015-01-08 14:57:54.195] [D] 192.168.1.8
[2015-01-08 14:57:54.196] [D] 192.168.1.9
[2015-01-08 14:57:54.197] [D] 192.168.1.10
[2015-01-08 14:57:54.199] [D] 192.168.1.11
[2015-01-08 14:57:54.199] [D] 192.168.1.12
[2015-01-08 14:57:54.200] [D] 192.168.1.13
[2015-01-08 14:57:54.201] [D] 192.168.1.14
[2015-01-08 14:57:54.202] [D] 192.168.1.15
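
If you only need the total rather than the full list (the simplification mentioned in the intro), a count-only variant is trivial. Here is a minimal sketch for a hypothetical action named getAddressCountInRange with the same startIP/endIP inputs, reusing the dot2num helper from the code above:

// Return only the number of addresses in the range
var total = (dot2num(endIP) + 1) - dot2num(startIP);
System.debug("Total Addresses in Range: " + total);
return total;

function dot2num(dot) {
    var d = dot.split('.');
    return ((((((+d[0])*256)+(+d[1]))*256)+(+d[2]))*256)+(+d[3]);
}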

 

SQL Plug-in + DynamicTypes = Simple CMDB for vCAC - Part 3


Welcome back! This is the third article of a multi-part series that steps you through the process of exposing our workflows from the last article to vRealize Automation's (vRA) Advanced Service Designer (ASD).

Introduction

This third article will cover the following topics:

  • How to add the simple CMDB to vRealize Automation's Advanced Service Designer
  • Add a Day 2 operation to delete an Asset from our table

Future article will cover the following topic:

  • How to add entries to the CMDB via vRealize Automation Infrastructure as a Service (IaaS) workflow

Let's get started - Login to vRA with your Tenant Admin credentials. (NOTE: Be sure that your account has the "Application Architect" role assigned since you will need to have the "Advanced Services" tab available to you in vRA.)

Advanced Services - Custom Resource

Custom Resources map to vRealize Orchestrator Inventory Object types. In the previous articles of this series, we introduced a new Object Type named DynamicTypes:MyCMDB.Asset. We must add this to vRA so that the object is something that can be requested and, more importantly, managed. By managed, I mean that it will be a provisioned resource that is visible on your Items tab in vRA and can have Day 2 operations assigned to it.

Add Resource

add_resource.png

  • In vRA, navigate to Advanced Services -> Custom Resources
  • Click on the Add button to add the new Custom Resource
  • When the page loads, enter "DynamicTypes" in the Orchestrator Type field - you should see a number of object types populate
  • Select the "DynamicTypes:MyCMDB.Asset" entry as shown in the screenshot above
  • Provide a Name as you wish it to appear in vRA
  • Optionally provide a Description and Version
  • Click Next

NOTE: If you have just added new Dynamic Types to the Orchestrator server that vRA is using, they may not show here right away. If this is the case, you can force vRA to "refresh" its view of available DynamicType objects by:

  • Go to Administration -> Advanced Services -> Server Configuration
  • Change your server configuration from one option to the other (for example, if set as "Use the default Orchestrator server," change to "Use an external Orchestrator server" and vice-versa. Even if you have a single Orchestrator, that is fine - just change between default/external or external/default) - Click Update
  • Now change back to your original setting and click Update
  • This process essentially reconnects vRA to the Orchestrator API and refreshes the list of available DynamicTypes Object Types

Add Resource - Details Form

add_resource_-_details_form.png

There is no need to change any fields here as they are not displayed to your users. Just click the Add button.
Now that vRA understands what a MyCMDB.Asset is, we need to provide a means for the users to actually request it. This means we need a Service Blueprint.

Service Blueprint - Workflow

service_blueprint_-_workflow.png

  • Click on the Service Blueprints link on the left of your vRA page (still on the Advanced Services tab)
  • Click the Add button to define a new Service Blueprint
  • Expand out your Orchestrator server and locate your Create a Record workflow created in the previous article
  • Select it and click Next

Service Blueprint - Details

service_blueprint_-_details.png

The Name and Description on this page are what your users will see when viewing the Catalog so make these user-friendly.

  • Change the Name field "Create a Record" to something like "Add an Asset" or "Request an Asset" or whatever is to your liking. For this tutorial, I will change it to "Add an Asset"
  • If you wish to hide the catalog request information page (typically not needed), place a check in the box
  • Optionally add Version info (NOTE: If adding Version, must be in format of major.minor.micro[-revision] for example 1.0.0 OR 1.0.1-1)
  • Click Next

Service Blueprint - Blueprint Form

service_blueprint_-_blueprint_form.png

The Blueprint Form is the actual form your users will fill out after they click on the Request button of the Catalog entry.
You may leave the form as-is, or customize the Display name of the inputs, hide fields, or set default values. At the very least, be sure to mark your required inputs as described in the "NOTE" below. Notice how the two "Required" fields are indicated with a red *.

NOTE: VERY IMPORTANT - The fields you specified as ID and NAME (in my case, Serverid and Servername respectively) MUST be filled out in order for the DynamicTypes to work properly. Failure to enforce this will result in a broken inventory. During the workflow creation in vRO, we specified these two fields as being mandatory but this input property did not carry over to the form designer. It is highly recommended to:

  • Mouse-over the Servername input field, then click the edit icon (pencil) that appears to the far right
  • Click on the "Constraints" tab of the pop-up window, then change "Required" to "Constant" with a value of "Yes", then click Submit
  • Repeat the steps above for the Serverid input

Click Next when done modifying the form

Service Blueprint - Provisioned Resource

service_blueprint_-_provisioned_resource.png

In some cases, you may want to allow vRA to present a request form that runs an Orchestrator workflow that does not result in a provisioned/manageable resource. However, this tutorial is not one of those cases. We want the result of a request to be visible in our Items tab so we must identify a Custom Resource that will be created when the workflow has been completed.

  • Click the Drop-down box and select the "assetOut[Asset]" entry. This is the Output field of the workflow we selected. We can choose it here because it has the same type as a Custom Resource we defined earlier in this article.
  • Click Add

Service Blueprints - Publish

service_blueprints_-_publish.png

Once a Service Blueprint has been created, it is immediately placed in "Draft" status. This means it will not be available as a catalog item to be assigned to services. Before navigating away from this page:
Select the "Add an Asset" row, then click the Publish button
Confirm that the Status of that row changes to Published

We still need to Add this Service Blueprint/Catalog Item to a Service so that it may be entitled, but let's go ahead and add our Day 2 operation while we're still in the Advanced Services tab first.

Day 2 Operation - Delete Record

day_2_operation_-_delete_record.png

Click Resource Actions, then click the Add button to create a new one.
This time for Select a Workflow, select your Delete a Record workflow and click Next

Day 2 Operation - Input Resource

day_2_operation_-_input_resource.png

The Resource type and Input parameter should already be selected as "Asset" and "asset" respectively so click Next

Day 2 Operation - Details

day_2_operation_-_details.png

The Name specified here is what users will see as an available action for the selected resource.

  • Change the value to something appropriate. For the purpose of this article, I will change mine to Delete Asset
  • You may optionally Hide the catalog request information page ( I will ) as well as set a Version
  • For Type, be sure to tick the Disposal field so that vRA knows that the provisioned Item is being disposed
  • Leave Target Criteria as the default
  • Click Next

Day 2 Operation - Form

day_2_operation_-_form.png

Although the workflow input is already set for the Asset, it may be a good idea to provide some text here as a warning.
Scroll to the bottom of the left pane until you see the "Form" section.
Drag a "Text" piece to the right pane where it says "Drag an item from the palette"
Click the Edit (pencil) icon for the Text, change the Size to Large, and for text enter your warning text.
Click Add when done (when prompted for the empty form action, click OK)

Resource Action - Publish

resource_action_-_publish.png

Like the Service Blueprint, the default status of a newly created Resource Action is Draft so select it and click Publish.

Services / Entitlements

services__entitlements.png

Okay, so we have a new Service Blueprint/Catalog Item and a Resource Action, and they're both published, but at this point nobody can request or do anything with the Asset. The final steps to get this functional are to:

  • Add the Catalog Item to a Service
  • Entitle the Service (if new) and Resource Action

To keep this new offering separate from my existing services (IaaS, PaaS, and SaaS), I will create a new Service called CMDB

Under Administration -> Catalog Management -> Services, click Add to create a new service named CMDB and Activate it.
Select your newly created Service (or an existing one if you already have one to use)
Click the "Manage Catalog Items" button

Services - Manage Catalog Items

services_-_manage_catalog_items.png

Click the Green + to add Catalog Items to this service
When the pop-up window appears, select the "Add an Asset" item we created earlier in this article (If it does not show, perhaps you forgot to Publish it)
Click Add when done

Entitlement

entitlement.png

If you have an existing Entitlement you wish to use, go ahead and add the Catalog item and Resource action to that Entitlement.

Navigate to Administration -> Catalog Management -> Entitlements , then click the Green + Add button to create a new Entitlement if not using an existing one.

Give your new Entitlement a meaningful name, I've chosen "CMDB Entitlement"
Set the status to Active
Select the Business Group you wish to entitle the CMDB services to, I've chosen "RP PreProd Group"
Finally, Select one or more Users & Groups - you'll need to type a name and press Enter to see the selection list show
Click Next when done

Entitlement - Items & Approvals

entitlement_-_items___approvals.png

On the Items & Approvals tab, you can specify which Services, Catalog Items (if not assigned to a Service), and Actions to include in the Entitlement. You can optionally specify an approval policy for each as well.

Next to "Entitled Services", click the Green +, then select the recently created Service "CMDB"
Next to "Entitled Actions", click the Green +, then select the recently created action "Delete Asset"
Click Add

Catalog - CMDB

catalog_-_cmdb.png

You are now ready to test the solution!

  • Be sure you are logged in with an account that has been entitled and click on the Catalog tab, then select your CMDB services group
  • Click Request on the Add an Asset Catalog Item

Catalog - New Request

catalog_-_new_request.png

Fill out the form as desired and click submit!

Items - Dynamic Types

items_-_dynamic_types.png

Click on the Items tab
(If you don't immediately see your requested item, give it a moment, then click the Refresh icon at the bottom of the window)
Select the row of your new item, then click the Actions button - a list of all your Day 2 operations/Resource Actions that have been entitled for you on this object type will be shown - Don't click Delete yet!

Click the View Details button or the Name of your item to view the item details.

Item Details

item_details.png

On the details page, you can see more information on the request.
On the right-side of the screen, all entitled Resource Actions for the object type provisioned will be shown.
Go ahead and click on the Delete Asset link now.

Delete Asset

delete_asset.png

Notice how the warning text we provided earlier is now shown.

Click Submit to delete the provisioned item.

It will take a minute or so for the item to be gone from vRA. If desired, click the Refresh button at the bottom of the page to verify the item has been deleted.

vRealize Orchestrator - Completed Workflows

vrealize_orchestrator_-_completed_workflows.png

During the course of this article, we Requested an Asset and later deleted it. The engine behind the Request/Delete operations was Orchestrator and the workflows we created in a previous article.

If you log into your Orchestrator client and check your "Create a Record" and "Delete a Record" workflows, you will see completed workflows as shown above.

Wrapping up...

This concludes the third article in the series.

Previous Articles in series:

  1. SQL Plug-in + DynamicTypes = Simple CMDB for vCAC - Part 1
  2. SQL Plug-in + DynamicTypes = Simple CMDB for vCAC - Part 2

How to use Python to start an Orchestrator Workflow


In a previous article, I taught you how to explore and use the REST API to start a Workflow using a generic browser based REST Client. In this article, I will provide a Python based example of running the "Create a Record" workflow that was created in Part 2 of my SQL Plug-in Dynamic Types Simple CMDB for vCAC article. Since I'm not even close to being proficient with Python, this will be a very short article! You may download the script in this article from my vroClientScripts Repository on GitHub.

 

Getting Started

In my efforts to figure out how to consume a REST API using Python, I found the following sites helpful:

  • http://www.pythonforbeginners.com/web/how-to-access-various-web-services-in-python
  • http://docs.python-requests.org/en/latest/
  • https://pip.pypa.io/

In particular, be sure you have a functional Python install with the "requests" module installed and ready to use. From what I can tell, the "json" module is part of recent Python versions, so no extra step should be required to install it.

The Script

The following code should be saved as something like runWorkflow.py:

# Update for your environment here. Use single quotes around each defined value.
usr = {REPLACE WITH YOUR VRO USERNAME}
pwd = {REPLACE WITH YOUR VRO PASSWORD}
wfid = {REPLACE WITH YOUR VRO WORKFLOW ID}
#NOTE: use double \\ or single / when specifying file path
jsonFile = {REPLACE WITH PATH TO JSON BODY FILE}
vroServer = {REPLACE WITH VRO URL:PORT}

##### Make no changes below this line ####
# Import the modules to handle HTTP calls and work with json:
#
# requests: http://docs.python-requests.org/en/latest/user/install/
# To install the "requests" module, python -m pip install requests
# json (http://docs.python.org/2/library/json.html)
#
#####
import requests, json

# Create basic authorization for API
vroAuth = requests.auth.HTTPBasicAuth(usr,pwd)
# Set headers to allow for json format
headers = {'Content-Type':'application/json','Accept':'application/json'}
url = 'https://' + vroServer + '/vco/api/workflows/' + wfid + '/executions'
data = open(jsonFile).read()
# NOTE: verify=False tells Python to ignore SSL Cert issues
# Execute a workflow using a json file for the body:
r = requests.post(url, data=data, verify=False, auth=vroAuth, headers=headers)

Note: Before attempting to run the script, be sure to modify the parameters at the beginning of the script to reflect YOUR workflow.

I have kept the script as simple as I can, including the option to NOT Verify the SSL Certificate so that self-signed certs do not prevent the script from running.

The json file

While testing this, my example workflow was the "Create a Record" workflow as noted earlier. That workflow had a number of inputs so these were loaded into a json file. In order to learn more about how to get the correct input json for your workflow, please reference my article: How to use the REST API to Start a Workflow

Here's my example json file:

{
    "parameters": [
        {
            "type": "string",
            "scope": "local",
            "name": "ServerName",
            "value": {
                "string": {
                    "value": "vcac-iaas-01c"
                }
            }
        },
        {
            "type": "string",
            "scope": "local",
            "name": "ServerIP",
            "value": {
                "string": {
                    "value": "192.168.110.89"
                }
            }
        },
        {
            "type": "string",
            "scope": "local",
            "name": "ServerDNS",
            "value": {
                "string": {
                    "value": "corp.local"
                }
            }
        },
        {
            "type": "string",
            "scope": "local",
            "name": "ServerID",
            "value": {
                "string": {
                    "value": "vcac-iaas-01d.corp.local"
                }
            }
        },
        {
            "type": "string",
            "scope": "local",
            "name": "ServerOwner",
            "value": {
                "string": {
                    "value": "administrator"
                }
            }
        }
    ]
}

Example Run

example_run.png

I ran the following from my Windows desktop command prompt:

C:\tools\python>python e:\runWorkflow.py
C:\tools\python\lib\site-packages\requests\packages\urllib3\connectionpool.py:734: InsecureRequestWarning: Unverified HTTPS request is being made. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.org/en/latest/security.html
InsecureRequestWarning)

When I check the Orchestrator client, I can see that the workflow has run and completed successfully (See screenshot above!)

Help Me Make This Sample Better!

As noted at the beginning of this article, I don't know Python, but I managed to get a very simple example script working here. Please submit your comments and/or improvement suggestions to help increase the value and usefulness of this simple script.
Improvements needed (a sketch of the first two follows below):
- Get credentials from a hidden config file
- Pass in all parameters as command line arguments
- Display/parse output
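
Here is a minimal sketch of the first two improvements, assuming Python 3 and a hypothetical config file named vro.cfg (kept out of source control) containing a [vro] section with usr and pwd entries:

import argparse, configparser

# Parse the remaining parameters from the command line:
parser = argparse.ArgumentParser(description='Start a vRO workflow via the REST API')
parser.add_argument('wfid', help='workflow ID')
parser.add_argument('jsonFile', help='path to the json body file')
parser.add_argument('--server', default='vro-server.domain.lab:8281', help='vRO server as FQDN:PORT')
args = parser.parse_args()

# Load the credentials from the hidden config file:
config = configparser.ConfigParser()
config.read('vro.cfg')
usr, pwd = config['vro']['usr'], config['vro']['pwd']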
Thanks!

How to use perl to start an Orchestrator Workflow


In a previous article, I taught you how to explore and use the REST API to start a Workflow using a generic browser based REST Client. In this article, I will provide a perl based example of running the "Create a Record" workflow that was created in Part 2 of my SQL Plug-in Dynamic Types Simple CMDB for vCAC article. I have barely more experience with perl than Python so this will be another very short article! You may download the script in this article from my vroClientScripts Repository on GitHub.

 

The Script

The following code should be saved as something like runWorkflow.pl - I tried to maintain similar structure as my Python article:

#!/usr/bin/perl
use REST::Client;
use MIME::Base64;
use File::Slurp;       # provides read_file to load the json body
use IO::Socket::SSL;   # exports the SSL_VERIFY_NONE constant used below

# Update for your environment here. Use double quotes around each defined value.
$usr = "{REPLACE WITH YOUR VRO USERNAME}";
$pwd = "{REPLACE WITH YOUR VRO PASSWORD}";
$wfid = "{REPLACE WITH YOUR VRO WORKFLOW ID}";
$jsonFile = "{REPLACE WITH PATH TO JSON BODY FILE}";
$vroServer = "{REPLACE WITH VRO URL:PORT}";

###### Make no changes below this line ##########
# Disable server verification (for older LWP implementations)
$ENV{PERL_LWP_SSL_VERIFY_HOSTNAME}=0;

# Setup the connection:
my $client = REST::Client->new();
# Disable server verification (for newer LWP implementations)
$client->getUseragent()->ssl_opts( SSL_verify_mode => SSL_VERIFY_NONE );

$client->setHost("https://$vroServer/vco/api");
$client->addHeader("Authorization", "Basic ".encode_base64( $usr .':'. $pwd ));
$client->addHeader("Content-Type","application/json");
$client->addHeader("Accept","application/json");

# Read the json body from file and perform an HTTP POST on the URI:
$client->POST( "/workflows/$wfid/executions", scalar read_file($jsonFile) );
die $client->responseContent() if( $client->responseCode() >= 300 );
print "Response Code: " . $client->responseCode() . "\n";
my @headers = $client->responseHeaders();
foreach (0..$#headers){
    print $headers[$_] . ": " . $client->responseHeader($headers[$_]) . "\n";
}

 


Note: Before attempting to run the script, be sure to modify the parameters at the beginning of the script to reflect YOUR workflow.

I have kept the script as simple as I can, including the option to NOT Verify the SSL Certificate so that self-signed certs do not prevent the script from running.

The json file

While testing this, my example workflow was the "Create a Record" workflow as noted earlier. That workflow had a number of inputs so these were loaded into a json file. In order to learn more about how to get the correct input json for your workflow, please reference my article: How to use the REST API to Start a Workflow

Here's my example json file:

{
    "parameters": [
        {
            "type": "string",
            "scope": "local",
            "name": "ServerName",
            "value": {
                "string": {
                    "value": "vcac-iaas-01c"
                }
            }
        },
        {
            "type": "string",
            "scope": "local",
            "name": "ServerIP",
            "value": {
                "string": {
                    "value": "192.168.110.89"
                }
            }
        },
        {
            "type": "string",
            "scope": "local",
            "name": "ServerDNS",
            "value": {
                "string": {
                    "value": "corp.local"
                }
            }
        },
        {
            "type": "string",
            "scope": "local",
            "name": "ServerID",
            "value": {
                "string": {
                    "value": "vcac-iaas-01d.corp.local"
                }
            }
        },
        {
            "type": "string",
            "scope": "local",
            "name": "ServerOwner",
            "value": {
                "string": {
                    "value": "administrator"
                }
            }
        }
    ]
}

 

Example Run

example_run.png

I ran the script from my OSX console, here is the output:

sh-3.2# ./runWorkflow.pl
Response Code: 202
Connection: close
Date: Thu, 29 Jan 2015 21:18:53 GMT
Location: https://vco55.vcoteam.lab:8281/vco/api/workflows/776ce656-1a3d-45da-a617-ba756295d96f/executions/ff8080814abac606014b378f9d370487/
Server: Apache-Coyote/1.1
Content-Length: 0
Client-Date: Thu, 29 Jan 2015 21:18:53 GMT
Client-Peer: 192.168.1.22:8281
Client-Response-Num: 1
Client-SSL-Cert-Issuer: /C=US/O=VMware/OU=VMware/CN=vco55.vcoteam.lab
Client-SSL-Cert-Subject: /C=US/O=VMware/OU=VMware/CN=vco55.vcoteam.lab
Client-SSL-Cipher: AES128-SHA
Client-SSL-Socket-Class: IO::Socket::SSL
Client-SSL-Warning: Peer certificate not verified

When I check the Orchestrator client, I can see that the workflow has run and completed successfully (See screenshot above!). Note that the "Location" header is the workflow token - that is what you would check for completion status as well as output parameters.
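
To illustrate, here is a minimal sketch (in Python for brevity, and assuming the execution resource returns a JSON body with a "state" field) of checking that status by performing a GET on the URL from the Location header:

import requests

# URL taken from the Location header in the output above:
execution_url = 'https://vco55.vcoteam.lab:8281/vco/api/workflows/776ce656-1a3d-45da-a617-ba756295d96f/executions/ff8080814abac606014b378f9d370487/'
r = requests.get(execution_url, verify=False, auth=('vcoadmin', 'vcoadmin'),
                 headers={'Accept': 'application/json'})
print(r.json().get('state'))   # e.g. running, completed, failed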

Help Me Make This Sample Better!

As noted at the beginning of this article, I don't know perl but I managed to get a very simple example script working here. Please submit your comments and/or improvement suggestions to help increase the value and usefulness of this simple script.
Improvements needed:
- Get credentials from a hidden config file
- Pass in all parameters as command line arguments
- Display/parse output
Thanks!

How to use PowerShell to start an Orchestrator Workflow


Okay, now that I have provided Python and perl articles on starting a vRealize Orchestrator (vRO / vCO) workflow via its REST API, it's time for a PowerShell script. For this article, I followed the same format as the previous two BUT provided the option to call the script with command line parameters! You may download the script in this article from my vroClientScripts Repository on GitHub.

The Script

The following code should be saved as something like runWorkflow.ps1:

Param(
    [string]$usr = 'myvROUser',
    [string]$pwd = 'myPassword',
    [string]$vroServer = 'vRO-Server.domain.lab:8281', # in format FQDN:PORT
    [string]$wfid = '2a2c773d-4f34-422e-b427-eddce95669d1',
    [string]$apiFormat = 'json', # either xml or json
    [string]$inputFile = 'e:body.json' # path to input file (either json or xml)
)
#### Make no changes below this line ###############################
# Usage:
# If you run the script with no parameters specified, the default values defined above will be used.
# to run with params, See following example: (Should be all one line)
# NOTE: It is not required to specify the name of each parameter, but the order will need to match the order in the above params section
# PS E:\> .\runWorkflow.ps1 -usr vcoadmin -pwd vcoadmin -vroServer vro-server.domain.lab:8281 -wfid 2a2c773d-4f34-422e-b427-eddce95669d1 -apiFormat json -inputFile e:body.json
#
####################################################################

add-type @"
    using System.Net;
    using System.Security.Cryptography.X509Certificates;
    public class TrustAllCertsPolicy : ICertificatePolicy {
        public bool CheckValidationResult(
            ServicePoint srvPoint, X509Certificate certificate,
            WebRequest request, int certificateProblem) {
            return true;
        }
    }
"@

[System.Net.ServicePointManager]::CertificatePolicy = New-Object TrustAllCertsPolicy

function ConvertTo-Base64($string) {
   $bytes  = [System.Text.Encoding]::UTF8.GetBytes($string);
   $encoded = [System.Convert]::ToBase64String($bytes); 

   return $encoded;
}

$token = ConvertTo-Base64("$($usr):$($pwd)")
$auth = "Basic $($token)"

$headers = @{"Authorization"=$auth;"Content-Type"="application/$($apiFormat)";"Accept"="application/$($apiFormat)"}
$body = Get-Content $inputFile -Raw
# write-output "Using body: " + $body
$URL = "https://$($vroServer)/vco/api/workflows/$($wfid)/executions"
Write-Output $URL
$ret = Invoke-WebRequest -Method Post -uri $URL -Headers $headers -body $body
$headers = $ret.Headers
ForEach ($header in $headers){
    Write-Output $header
}

Note: Before attempting to run the script, be sure to modify the parameters at the beginning of the script to reflect YOUR workflow.

I have kept the script as simple as I can, including the option to NOT Verify the SSL Certificate so that self-signed certs do not prevent the script from running.

The json file

While testing this, my example workflow was the "Create a Record" workflow as noted earlier. That workflow had a number of inputs so these were loaded into a json file. In order to learn more about how to get the correct input json for your workflow, please reference my article: How to use the REST API to Start a Workflow. Please note that the input file could just as easily be an XML file; you would just need to change the input parameter apiFormat to have a value of xml instead of json. That parameter simply tells the script which format to use for the Accept and Content-Type headers.

Here's my example json file:

{
    "parameters": [
        {
            "type": "string",
            "scope": "local",
            "name": "ServerName",
            "value": {
                "string": {
                    "value": "vcac-iaas-01c"
                }
            }
        },
        {
            "type": "string",
            "scope": "local",
            "name": "ServerIP",
            "value": {
                "string": {
                    "value": "192.168.110.89"
                }
            }
        },
        {
            "type": "string",
            "scope": "local",
            "name": "ServerDNS",
            "value": {
                "string": {
                    "value": "corp.local"
                }
            }
        },
        {
            "type": "string",
            "scope": "local",
            "name": "ServerID",
            "value": {
                "string": {
                    "value": "vcac-iaas-01d.corp.local"
                }
            }
        },
        {
            "type": "string",
            "scope": "local",
            "name": "ServerOwner",
            "value": {
                "string": {
                    "value": "administrator"
                }
            }
        }
    ]
}

Example Run

example_run.png

I ran the following from my Windows desktop PowerShell prompt:

PS E:\> .\runWorkflow.ps1 -usr vcoadmin -pwd vcoadmin -vroServer vco-server.domain.lab:8281 -wfid 2a2c773d-4f34-422e-b427-eddce95669d1 -apiFormat json -inputFile e:body.json
Key                 Value
---                 -----
Content-Length      0
Date                Fri, 30 Jan 2015 18:04:06 GMT
Location            https://vco.domain.lab:8281/vco/api/workflows/2a2c773d-4f34-422e-b427-eddce95669d1/executions/ff8080814b27270a014b3c03a3950419
Server              Apache-Coyote/1.1

When I check the Orchestrator client, I can see that the workflow has run and completed successfully (See screenshot above!)

Help Me Make This Sample Better!

As noted at the beginning of this article, I don't know PowerShell, but I managed to get a very simple example script working here. Please submit your comments and/or improvement suggestions to help increase the value and usefulness of this simple script.

Thanks!

Create a plug-in for a REST based web service in minutes - part 1

In a previous article I explained how Dynamic Types work and how useful they are for creating a vCO / vRO plug-in that enables the XaaS capabilities of vCAC / vRA.
I then explained how to build your own Twitter plug-in using the plug-in generator package. I have now extended the capabilities of the plug-in generator and will demonstrate these in this new series of articles. This article uses NSX as the orchestrated endpoint, but by following the explanations in this tutorial you should be able to get it to work with many REST web services.

Pre-requisites
  • The system to be orchestrated via the plug-in must have a REST based interface that works with the REST plug-in.
  • Make sure that you have the minimum version of the DynamicTypes plug-in and of vCO as specified on VMware communities.
  • Import the v2 package to your vCO client.
  • Import icons for the inventory objects you will want to create as resource elements. If you do not have these you will still be able to use some default icons, but having nice icons makes things so much prettier. If you need the NSX icons, I have posted these on communities.

Warning: Not a single line of code is required to generate a plug-in when using the plug-in generator - if you landed here looking for complex development tasks taking several days of work and requiring a complex development environment, you will be frustrated by the efficiency and simplicity of this tutorial!

To create a new plug-in, go to the inventory view and unfold Dynamic Types / Type Hierarchy. Right click on Type Hierarchy and run the workflow Plug-in Gen - Create new plug-in.





Enter a plug-in name and select an icon.




Submit. If the "Running workflow" window stays open, click "Run in background".



To create the inventory objects we first need to add a new host. Right click on the newly created namespace and run the "Plugin gen - Add new host" workflow.




The next screens are exactly the same as the library workflow to add a REST host. Provide a name and URL.



On the next screens, you will have to enter proxy settings (if required) and authentication types and credentials.

Once you have submitted, you can check that you have a new host in the HTTP-REST inventory. You can even right click / add a REST operation and then invoke it to make sure everything works as expected.


We are now ready to create a type. Unfold the Type Hierarchy, right click on the plug-in namespace host type, and run the workflow Plugin gen - Create a type.



Wait for the presentation to finish loading (this can take from 20 s to 1 min), then:

 


  • Enter a type name. This should be all lower case, singular, and alpha only.
  • Select an icon for the type (if you do not have a specific one you can use "item-16x16").
  • Choose a folder name. This is going to be the name of the folder showing in the inventory. You can use an upper case first letter, plural, and spaces if needed.
  • By default the same icon as for the type will be used for the folder. If you decide otherwise, the default icon will be a folder that you may change to a more specific folder icon.
  • Object cache is the time for which this object type stays in the cache. Once the plug-in gets an object from the remote system, it stores the result for this duration and does not request it again. This greatly improves performance in the vCO UI and in the workflows manipulating the objects, particularly with arrays of objects or when using loops within a workflow. A lower value keeps the inventory more current against changes not made by vCO, but slows down performance. Too high a value means a less current inventory, which may cause errors in a workflow if, for example, there is an attempt to modify an object that was deleted outside of vCO. (A toy sketch of this caching behavior follows below.)
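
Here is a toy Python sketch (an assumption for illustration, not the plug-in's actual code) of the caching behavior described in the last bullet: a fetched object is reused until its time-to-live expires.

import time, requests

_cache = {}

def get_object(url, ttl_seconds=300):
    # Reuse the cached response for this URL while it is still fresh:
    hit = _cache.get(url)
    if hit and time.time() - hit[0] < ttl_seconds:
        return hit[1]
    # Otherwise fetch from the remote system and remember when we did:
    value = requests.get(url).json()
    _cache[url] = (time.time(), value)
    return value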

On the next screen you can choose the host that will be used interactively to create the object. If you created a single host it should be set by default. Click next.




If you are in a hurry just click next.

The relation types screen defines how many URLs you can provide to get the object.
It is now time to read your endpoint API guide and find out how to get the object.

In my case the /api/4.0/edges URL gets all the edge objects and the /api/4.0/edges/{edgeId} URL gets a single edge.
The findRelation URL will get all the objects under the Edges folder. Since this folder is a first child of the host, these will be all the edges, using the /api/4.0/edges URL. The findRelation URL is always mandatory.
In a similar way, the findAll URL will use the /api/4.0/edges URL. findAll is used to list all objects when selecting one for a workflow input.
The findById URL needs to be specified with a parameter in curly braces {}.

If you do not provide a findAll URL, findRelation will be used on all the objects that are of the parent type of the object you are looking for.
If you do not provide a findById URL, findAll will be used to get all objects, which are then iterated until finding the one with a matching ID. (A sketch of these fallback rules follows below.)

To implement a plug-in quickly, you can choose NOT to use the findById and findAll URLs. This shortens the creation of a single object and avoids the case where the API returns different object properties for the different URLs. Also, some APIs do not provide a URL for findAll or findById, only for findRelation. If you have more time to implement and the API provides the different URLs and returns the same object properties, then you should definitely use the findById and findAll URLs, since they improve query performance a lot.
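
The fallback rules above can be summarized with a rough Python sketch (assumed pseudo-logic with hypothetical helper names, not the plug-in generator's actual code):

# Placeholders standing in for the generated REST calls:
def rest_get(url):
    raise NotImplementedError

def find_relation(parent, type_def):
    raise NotImplementedError

def find_all(type_def):
    if type_def.get('findAllUrl'):
        return rest_get(type_def['findAllUrl'])
    # No findAll URL: run findRelation against every object of the parent type.
    objects = []
    for parent in find_all(type_def['parentType']):
        objects.extend(find_relation(parent, type_def))
    return objects

def find_by_id(type_def, object_id):
    if type_def.get('findByIdUrl'):
        return rest_get(type_def['findByIdUrl'].format(id=object_id))
    # No findById URL: get all objects and iterate until the id matches.
    for obj in find_all(type_def):
        if obj.get('id') == object_id:
            return obj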

Version 1 of the plug-in generator required all URLs to be filled, making it impossible to implement objects whose API does not provide one of these URLs.

On the next screen you will have to set the findRelation properties.




Enter the URL. Likely this object will be directly under the host object in the inventory, and the URL will be the same as the one used to find all objects.
Once you set the focus on the next input, the response should appear. It should be in JSON format. Do not edit the response content. Instead, copy the response content and paste it into an online JSON viewer or JSON editor. I like to use http://jsonviewer.stack.hu. Paste your JSON in the text area and check the viewer.

The "Use action ..." option is an alternative way to get the object properties, requiring you to write an action that takes a json string as input and returns an array of Properties objects. It is used when performance is key or when the object has complex properties that need to be parsed / transformed programmatically. I will cover this case in another article since it is not needed in most cases.

You may wonder what happens if the REST web service sends a response in XML format. The plug-in generator makes its request with a header asking for JSON format. This works, for example, with the NSX API. When this is not supported, the plug-in generator uses an XML2JSON method included in the REST plug-in to do the conversion.


The JSON viewer will show something like this. Unfold the tree to recognize the different elements.



What you are looking for is the path to the array [] of objects {} that is returned. Some APIs return the array of objects at the root element while others have intermediary objects. Once you have identified this path, get to the next screen.



The objects selector, or object accessor, is the path you have identified. If the array of objects is nested inside child objects, enter the path starting with a "." and with elements separated by ".", as if it were a javascript object. In my case the array of objects is under the path .edgePage.data
The preview size limits the number of objects returned in the objects preview. This was designed to avoid getting a very large number of objects, since that is not needed to identify the properties we will use.
After using tab to get from one field to another, the objects preview should be populated. If you copy and paste the content into the JSON viewer it should look similar to this:



With the objects selector entered, the objects will be found automatically for each request that happens. Go to the next screen to define the object properties selectors.
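
As a quick illustration of what an objects selector points at, here is a Python snippet (with a trimmed-down sample response, assumed for illustration) showing that .edgePage.data simply walks the JSON tree down to the array of objects:

import json

response = json.loads('{"edgePage": {"data": [{"objectId": "edge-2", "name": "my-edge"}]}}')
edges = response['edgePage']['data']   # the ".edgePage.data" selector
print(edges[0]['name'])                # -> my-edge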




The list of properties selectors will be set by default based on the properties returned by the objects preview. You have to edit the properties selectors to:
- Check that they include only the properties you need.
- Check that they include at least an id and a name property.

In my case it is missing the id property. Click on Insert value.


To determine which property can be used as a unique ID for the object, look at the object properties in the JSON viewer. In my case this property is objectId, so the accessor is .objectId.
Once you accept the change, the properties preview should be filled. You can copy and paste the content into a text editor to check that the properties are retrieved as expected.
Object properties that are not simple types may not be displayed in a readable form; for example, they may list [object Object]. Since Dynamic Types only supports string properties, such properties will be made available in the object in JSON format. This allows converting them to a javaScript object in a vCO scriptable task using var object = JSON.parse(objectProperty);


If you selected to provide a findById or findAll URL, you will go through similar steps to provide object and properties accessors. Once done, you will be asked if you want to merge or intersect the properties returned by the different URLs.

If not, you will see the list of object properties. You can submit the workflow and click run in background.

You can now check the Type Hierarchy in the inventory.




The type was created under a Folder type, itself created under the host type.

Unfold the namespace.



Under the namespace there is your host instance, which has the folder instance as a child, which in turn has the object instances as children. Managing multiple hosts is a new feature of version 2 of the plug-in generator.

Click on one of the objects and check the right pane for its properties.




All the properties are listed. As explained above, the complex ones are in JSON format.
If you look at the id property, it is in the format hostId/objectId, so there won't be any conflict when using more than a single host.


You have managed to create your own plug-in object in minutes without typing a single line of code. The next article in the series will walk through creating objects whose parent is another object type different from the host type.

If you want to create other objects for an NSX plug-in, you can run the Create a type workflow using the following parameters:

| Type name | Icon | Folder name | FindRelation URL | Objects selector | Properties |
| --- | --- | --- | --- | --- | --- |
| datacenter | datacenter.png | Datacenters | /api/2.0/vdn/scopes/vdnscope-1 | .clusters | id: .cluster.scope.id, name: .cluster.scope.name |
| edge | edge.png | Edges | /api/4.0/edges | .edgePage.data | |
| securityGroup | securityGroup.png | Security Groups | /api/2.0/services/securitygroup/scope/globalroot-0 | | id: .objectId |
| transportZone | transportZone.png | Transport Zones | /api/2.0/vdn/scopes | .allScopes | |
| securityTag | securityTag.png | Security Tags | /api/2.0/services/securitytags/tag | .tagList | id: .objectId |
| securityPolicy | securityPolicy.png | Security Policies | /api/2.0/services/policy/securitypolicy/all | .policies | id: .objectId |
| virtualWire | virtualWire.png | Virtual Wires | /api/2.0/vdn/virtualwires | .dataPage.data | id: .objectId |


Create a plug-in for a REST based web service in minutes - Part 2


In the previous article I walked through, step by step, creating a plug-in object type leveraging the Plug-in Generator version 2.
In this article I will demonstrate one of the new features of this version 2: creating a child object type.


Having the plug-in inventory organized in a tree view allows representing most of the object hierarchies found in application APIs.
Resuming from part 1, I need to create a vNIC object. It does not make any sense to list all the vNIC objects under the NSX host, since each edge has up to 8 vNICs that may have the same name.
The API guide provides GET https://<nsxmgr-ip>/api/4.0/edges/<edgeId>/vnics, which shows that a vNIC is under an edge object.

In the type hierarchy right click on edge / Run "Plugin gen -3- Create a type"



After a few seconds the first screen will appear.



Enter a type name, select an icon, and enter a proper folder name. Click next. Select the host you will interact with for the requests. Click next.




While this is optional, this time we will select to use a findById URL as well. This will reduce the number of requests and improve performance.




The findRelation URL is adapted from the API guide: https://<nsxmgr-ip>/api/4.0/edges/<edgeId>/vnics
  • No need to include the https://<hostname>
  • The parameter has to be in curly brackets {}
  • The parameter name must be a type you defined
Note that validation will be done on the URL, the parameter name, and order.
Once you have entered the URL and hit tab, the list of available edges should appear in the parameter 1 field.

Copy and paste the response into a JSON viewer (e.g. http://jsonviewer.stack.hu)




You can see that the result is not the vnic objects we were expecting but an error object. In this particular case edge-1 does not support the /vnics API, since it is an edge of type distributedRouter.
Switch parameter 1 to another edge.





This will update the response. Copy and paste the response into a JSON viewer.



We now get an array [] of vnic objects {} under .vnics
Click next, use .vnics as the objects selector, and press tab.



Copy and paste the response into a JSON viewer. Now we get the vnic properties. Note that there is no id property, but there is an index one.
Click next.



Since the object is missing the mandatory id property, the workflow validation raises an error. Click on the properties selectors input.



As expected, all properties are listed, including the mandatory name property but missing the id property. Click Insert value.


Use id for the property and .index for the accessor.


The validation error disappears and the properties preview is populated. Click next.



Since we chose to use a findById URL to get better performance from the plug-in, we get another round of interacting with the target server.



  • The URL is the same as the one we used for findRelation except it has the extra /{vnic} to get a single vnic. The name of the last parameter must be the same as the type name we defined on the first screen; otherwise you will get a validation error.
  • Parameter 1 must again be set to an edge different from edge-1, since that one does not support the /vnics API.
  • Parameter 2 can be any of the configured NICs. While the non-configured NICs are listed as well, a configured NIC will provide a better response sample.

You can copy and paste the response into a JSON viewer to check the properties. Notice that the JSON contains the object properties directly, without any intermediate object.


Click next. This time there is nothing to do, since there is no need to enter a path to the properties in the object accessor input.




Click next. As before the id property is missing.




Edit the properties accessors input and insert a new value.



Use id for the property and .index for the accessor. The properties preview should now be populated. Click next.





Since we are getting object properties from two methods / URLs, we may get different properties (it seems like bad design, but believe me, it does happen!).
This last screen defines the properties of the object you are creating.
Intersecting properties will only use the ones that are common to the different methods. Merging properties aggregates them (in this case some may not be filled, depending on which method is called).
To prevent property discrepancies there are a few workarounds:
  • If different property names return the same property values, rename them when defining the properties accessors so they can be intersected
  • You may use your own action (this will be the subject of another article) to get properties programmatically. This offers a lot more flexibility
  • If performance is not an issue (small set of objects) and the URL you define in findRelation gets the properties you need, only define findRelation
In the case of the vnic it does not matter, since the properties are the same when getting all vnics for an edge (findRelation) and when getting a specific one (findById). A toy illustration of intersect vs. merge follows below.
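
Here is a toy Python illustration (with assumed property names, not plug-in code) of the difference between intersecting and merging the property sets returned by the two URLs:

# Property names returned by each method (hypothetical example):
from_find_relation = {'name', 'index', 'mtu'}
from_find_by_id = {'name', 'index', 'portgroupId'}

intersected = from_find_relation & from_find_by_id   # only the common properties
merged = from_find_relation | from_find_by_id        # all properties; some may be
                                                     # empty depending on the URL used
print(sorted(intersected))   # ['index', 'name']
print(sorted(merged))        # ['index', 'mtu', 'name', 'portgroupId']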

Submit your workflow. Click run in background.


Check your type hierarchy.



You now see the vnicFolder and vnic under the edge type.


Unfold the namespace / host / edges.



You now see the Nics folder listed; in my case it has objects under the second edge but not the first one (that was the edge of type distributed router that did not support the /vnics API - if you need those NICs, go through this workflow again using /interfaces at the end of the URL).

So basically, by following parts 1 and 2 of this plug-in generator series, you should be able to create your own plug-in inventory using the plug-in generator "Create a type" workflow, and eventually get to something like this in a couple of hours:




In the next part we will see how to create, operate on, and delete the objects we have in the inventory.