Jun 04

NetApp Dynamic Storage Types Package

I am pleased to announce the NetApp Dynamic Storage Types Package. This package allows NetApp storage to be set as an output of vRealize Orchestrator (vRO) workflows and, in turn, used in vRealize Automation life-cycle management. This is done by taking advantage of the native Dynamic Types plug-in capability in vRO 5.5.2 and above.

In the Create a Clustered Data ONTAP NFS Volume w/QoS workflow (acquired through the NetApp OnCommand WFA and VMware vRealize Orchestration Workflows Sample Pack and shown below), an output parameter named wfaVolume has been set up.

dynamicTypes workflow example

When looking at the Outputs of this workflow you will see that wfaVolume has DynamicTypes: NetApp.WFA.Volume as its Type.

dynamicTypes workflow example 2

What this allows for is the tracking of NetApp storage creation in vRealize Orchestrator and the creation of an output associated with NetApp storage.  When this workflow runs, it effectively says, “I just created a NetApp WFA volume.”  In short, it gives vRO visibility into NetApp storage.
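To make the idea concrete, here is a minimal sketch of what such an output reference carries. The field and class names are illustrative, not the actual vRO plug-in API; only the `DynamicTypes:NetApp.WFA.Volume` type string comes from the workflow shown above.

```python
# Illustrative model of a vRO Dynamic Types object reference.
# Field names are hypothetical; only the type string mirrors the real output.
from dataclasses import dataclass

@dataclass(frozen=True)
class DynamicObjectRef:
    namespace: str   # Dynamic Types namespace, e.g. "NetApp.WFA"
    type_name: str   # type within the namespace, e.g. "Volume"
    object_id: str   # identifier of the storage object that was created

    @property
    def vro_type(self) -> str:
        # How the type appears in the workflow's Outputs tab
        return f"DynamicTypes:{self.namespace}.{self.type_name}"

# A workflow that creates an NFS volume could emit an output like this:
wfa_volume = DynamicObjectRef("NetApp.WFA", "Volume", "vol_nfs_01")
print(wfa_volume.vro_type)  # DynamicTypes:NetApp.WFA.Volume
```

Because the output carries a namespace, a type, and an object identifier, anything downstream (such as vRA) can treat it as a first-class inventory object rather than a loose string.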

This functionality and visibility then move up the stack to vRealize Automation.  In vRA, under Advanced Services, you can create custom resources that mirror these storage types; in this case Cluster, SVM, and Volume.  As shown in the example below, NetApp Cluster, NetApp Storage Virtual Machine (SVM), and NetApp NFS Volume all correspond to dynamic storage types created using this package.

dynamicTypes vRA example 3

At the baseline, this grants visibility of NetApp storage at the vRealize Orchestrator level, which is a great thing to have.

To access the package, please click the link provided below. Setup instructions are included in the compressed file.

NetApp Dynamic Storage Types Package

As always, I welcome any feedback or comments you may have.

Thank you for your time.

-McCloud


Jun 02

NetApp OnCommand Workflow Automation and VMware vRealize Orchestration Workflow Sample Pack

We are pleased to announce the NetApp OnCommand WFA and VMware vRealize Orchestration Workflows Sample Pack. This sample pack contains multiple workflows that drive storage automation from VMware vRealize Orchestrator to NetApp OnCommand Workflow Automation over the REST interface provided by the previously published NetApp vRealize Integration Package for OnCommand WFA 3.0.  The sample pack is intended both to be functional and to demonstrate how to build workflow frameworks in vRO that access WFA storage automation workflows.  The compressed file contains a vRealize Orchestrator workflow package, a second compressed file with complementary NetApp OnCommand WFA workflows (as well as the additional finders and filters needed for full functionality), and setup instructions for both the vRO package and the WFA workflows.
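As a rough illustration of the kind of REST call a vRO workflow in this pack makes to WFA, here is a sketch that builds the URL and request body for executing a WFA workflow with user inputs. The host name, workflow UUID, and input names are placeholders; consult the included setup instructions for the actual endpoints and payload shape.

```python
# Sketch only: builds (but does not send) a WFA workflow-execution request.
# Host, UUID, and input names are hypothetical placeholders.
from xml.etree.ElementTree import Element, SubElement, tostring

def build_wfa_job_request(inputs: dict) -> bytes:
    """Build an XML body supplying user-input values for a WFA workflow run."""
    root = Element("workflowInput")
    values = SubElement(root, "userInputValues")
    for key, value in inputs.items():
        SubElement(values, "userInputEntry", key=key, value=str(value))
    return tostring(root)

base = "https://wfa.example.com/rest"              # hypothetical WFA server
uuid = "aa9e1f2c-0000-0000-0000-000000000000"      # hypothetical workflow UUID
url = f"{base}/workflows/{uuid}/jobs"              # POSTing here would start a job
body = build_wfa_job_request({"VolumeName": "vol_nfs_01", "VolumeSizeGB": 10})
print(url)
```

In the actual pack, the equivalent logic lives in vRO actions and a REST host configured against your WFA server; this sketch only shows the request-building step.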

To access the package, please click the link provided below.

NetApp OnCommand WFA and VMware vRealize Orchestration Workflows Sample Pack

As always, I welcome any feedback or comments you may have.

Thank you for your time.

-McCloud


May 18

NetApp vRealize Integration Package for OnCommand WFA 3.0

We at NetApp have just released the third version of the NetApp vRealize Integration Package for OnCommand WFA 3.0. This package integrates NetApp OnCommand Workflow Automation (WFA) with VMware’s vRealize Automation (vRA) (formerly vCloud Automation Center). The package is installed in vRealize Orchestrator (vRO) (formerly vCenter Orchestrator, or vCO) and then set up to communicate over a REST interface with WFA.  The package allows for the creation of software-defined storage environments, using vRO to automate virtual machine provisioning and WFA to automate the storage needed for the virtual machines.

There are several key enhancements in this package. First, it is faster and much easier to deploy than the previous version. It also takes advantage of the enhanced REST interface in WFA 3.0 and can make direct calls to WFA workflows. Previously, the WFA database had to be accessed in order to attain drop-down functionality in vRO (or vCO) equivalent to that found in WFA.  With this feature, direct REST calls to the workflow provide that same functionality without the need to access the WFA database.  Setup is also now a breeze, with a single workflow that assists in configuring the package in vRO to communicate with WFA.  Finally, the package is backward compatible with previous versions of WFA (WFA 2.1 and later have been tested) and with previous versions of vRO (5.5.1 and later have been tested).
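The drop-down enhancement can be sketched as follows: instead of querying the WFA database, a vRO action GETs a workflow resource over REST and extracts the allowed values for a user input. The endpoint path and response shape below are assumptions for illustration only.

```python
# Sketch of REST-based drop-down population (no database access needed).
# The URL pattern and response fields are illustrative assumptions.

def user_input_url(base: str, workflow_uuid: str) -> str:
    """URL a vRO action could GET to describe a workflow's user inputs."""
    return f"{base}/workflows/{workflow_uuid}"

def dropdown_values(user_inputs: list, name: str) -> list:
    """Pull the allowed values for one user input out of a parsed response."""
    for ui in user_inputs:
        if ui["name"] == name:
            return ui.get("allowedValues", [])
    return []

# Simulated parsed response for populating a vserverName drop-down:
parsed = [{"name": "vserverName", "allowedValues": ["svm1", "svm2"]}]
print(dropdown_values(parsed, "vserverName"))  # ['svm1', 'svm2']
```

The point of the design is that the same REST call that describes the workflow also carries everything needed to build the menus, so the vRO side stays read-only and loosely coupled to WFA internals.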

The package can be found at the link below. You’ll need to set up an account to access the NetApp Private Communities in order to download it.  An installation guide is included as part of the .zip file download.

NetApp vRealize Integration Package for OnCommand WFA 3.0

If for some reason you cannot access the NetApp communities feel free to send me an email or tweet and I’ll be happy to email it to you.

As always, thank you for your time and please provide any feedback you’d like. I’m always happy to hear from you!

-McCloud

 


Jul 22

Version 2.2 of the OnCommand Workflow Automation package for VMware vCenter Orchestrator Released

NetApp has just released version 2.2 of the WFA package for VMware vCO.  The primary improvement in this version is that it now supports WFA 2.2 natively. This was done by adding handling for the “PLANNING” job status, which is new in WFA 2.2.  The package also now supports VMware’s vCenter Orchestrator 5.5.1.  To understand how to set up this package, please refer to TR-4306: Building Automation and Orchestration for Software-Defined Storage with NetApp and VMware.  To download the package, go to the NetApp Communities page or click the link below for a direct download.
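The job-status change can be sketched as a small polling loop: v2.2 simply treats “PLANNING” as another in-progress state rather than an unknown one. Only “PLANNING” is confirmed by this post; the other status names below are illustrative assumptions about the WFA job life cycle.

```python
# Sketch of job-status handling. "PLANNING" is the status added for WFA 2.2;
# the remaining status names are assumptions for illustration.

IN_PROGRESS = {"PENDING", "SCHEDULED", "PLANNING", "EXECUTING"}
TERMINAL = {"COMPLETED", "FAILED", "ABORTED"}

def poll(statuses):
    """Walk a sequence of polled job statuses; return the terminal one."""
    for status in statuses:
        if status in TERMINAL:
            return status
        if status not in IN_PROGRESS:
            raise ValueError(f"unknown WFA job status: {status}")
    return None  # job still running when polling stopped

print(poll(["PLANNING", "EXECUTING", "COMPLETED"]))  # COMPLETED
```

Without the “PLANNING” entry in the in-progress set, a WFA 2.2 job would be rejected as an unknown status on its very first poll, which is exactly the failure mode this release fixes.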

 

NetApp OnCommand Workflow Automation Package for vCO v2.2.zip


Jun 17

Two New Co-Branded Technical Reports

I’m pleased to announce two new jointly branded NetApp and VMware technical reports:  TR-4308: Software-Defined Storage with NetApp and VMware and TR-4306: Building Automation and Orchestration for Software-Defined Storage with NetApp and VMware.

TR-4308 describes the joint NetApp and VMware solution for implementing a software-defined storage environment, which is a critical component in achieving an entire software-defined data center (SDDC).  TR-4306 is a deployment guide that explains how to set up that joint solution. The described setup can be used to create an orchestration and automation environment that uses NetApp clustered Data ONTAP® storage, NetApp OnCommand® Workflow Automation (WFA), VMware vCloud Automation Center (vCAC) 6.0, and VMware vCenter™ Orchestrator (vCO) 5.5.

Take some time to read through them and let me know what you think.

As always, I appreciate the time you have taken to read this post, and I’d love to get your feedback on these technical reports and hear what you’d like to see.  Feedback on these TRs will definitely help me produce guides that are more beneficial to my audience.

Thanks for reading!

-McCloud

 


May 15

Software-Defined Storage with NetApp and VMware – Part 4: Advanced Workflow Design with Wrapper Workflows in vCO and WFA

It’s taken me a while to get this post out; I have been swamped with multiple projects in the realm of automation and orchestration for software-defined storage. With that said, let’s get into how to create wrapped vCO workflows that call OnCommand Workflow Automation (WFA) workflows.

First, we should have a discussion on workflows by looking at the creation of a specific wrapped workflow.  We will use the creation of a VMware datastore as an example of using smaller, more specific workflows to build a larger, more complex workflow. In this example, the end goal for the larger workflow is a datastore with the following characteristics: NFS, 10GB in size, deduplication enabled, a QoS limit of 13MB/s, a snapshot policy set, and mounted on each host in the cluster. We can accomplish this in two different ways.  First, we can create all of the smaller workflows needed to build the overall larger workflow. To accomplish this, the following small storage workflows would need to be created:

  • Create a Clustered Data ONTAP NFS Volume
  • Add Deduplication to Volume
  • Add QoS Policy to a Volume
  • Create a Snapshot Policy for Volume
  • Create SnapMirror
  • Create SnapVault

The second method for creating a larger workflow would be to first create environment “pack” workflows.  An environment “pack” would add characteristics for specific environments (such as Oracle, Microsoft Exchange, or VMware) to volumes, as well as backup and recovery and disaster recovery options.  With this method the workflows needed would be as follows:

  • Create a Clustered Data ONTAP NFS Volume
  • Add VMware Properties to Volume (would include deduplication, thin provisioning, and a predetermined QoS policy)
  • Add Backup and Recovery and Disaster Recovery Options to Volume (includes snapshot policy, SnapMirror replication, and SnapVault backup)

Obviously, this decreases the number of workflows needed to create specific volumes for specific environments, but it slightly increases the complexity of the baseline workflows used. This method still allows for a modular approach to creating larger workflows as well.
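The two composition approaches can be sketched with each workflow modeled as a plain function. The function names and the fixed 10GB/13MB/s values echo the datastore example above; everything else is illustrative.

```python
# Sketch of wrapper-workflow composition; each function stands in for a
# small vCO/WFA workflow. Names and values are illustrative.

def create_nfs_volume(name, size_gb):
    return {"name": name, "size_gb": size_gb}

def add_dedupe(vol):
    vol["dedupe"] = True
    return vol

def add_qos(vol, mbps):
    vol["qos_mbps"] = mbps
    return vol

# Method 1: the wrapper chains every small, single-purpose workflow itself.
def create_vmware_datastore(name):
    vol = create_nfs_volume(name, 10)
    vol = add_dedupe(vol)
    vol = add_qos(vol, 13)
    return vol

# Method 2: an environment "pack" bundles the per-environment settings,
# so a wrapper only needs two steps: create the volume, then apply the pack.
def add_vmware_properties(vol):
    return add_qos(add_dedupe(vol), 13)

vol = create_vmware_datastore("vol_vmware_01")
print(vol["qos_mbps"])  # 13
```

The trade-off mirrors the prose above: method 1 keeps each building block trivial but makes the wrapper longer, while method 2 moves complexity into the “pack” so wrappers stay short.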

Pictured below is a series of workflows we have developed as part of a sample pack. Notice that the first four workflows listed above are included in the sample pack shown below.

image

Now let’s look at how to accomplish the creation of the larger workflow. In vCO, this larger workflow is known as a wrapper workflow, since you are wrapping multiple smaller workflows into one larger workflow.

Building the Wrapper Workflow

Note:  The steps below are merely an example of how to create a wrapper workflow from three other vCO workflows that call WFA.  Each workflow is a different entity unto itself. Please use the steps below as a guide for building your own, not as an exact “how to” for every workflow.

Start by opening a baseline workflow and clicking Edit. In this case we will open Create a Clustered Data ONTAP NFS Volume. This specific workflow creates a new NFS volume and either uses a default QoS policy or creates a new QoS policy.

image

It is recommended that you make a copy of the baseline workflow to work with instead of editing the actual workflow. Right-click the workflow you want to duplicate and click “Duplicate workflow”. In the Duplicate workflow screen, rename the workflow and place it in the correct folder.

image

image

Once you have done this, right-click the duplicated workflow and click Edit. Go to the Schema tab, and then in the left-side menu bar select “All Workflows”.

image

image

image

Drag and drop the Add Deduplication to Volume(s) and Add Thin Provisioning to Volume(s) workflows into the schema after the NetApp WFA Workflow Execution. Ignore the “Do you want to add the activity’s parameters as input/output to the current workflow” question at the top of the screen when you do this.

image

Next, click on the edit button on the Add Deduplication to Volume(s) workflow that was added to the overall workflow. This will bring up the edit screen.

image

Click on “not set” next to clusterName. Then select ClusterName in the Chooser… screen and click Select. Perform the same steps for vserverName, but select VserverName in the Chooser… screen and then click Select.

image

image

For volumeGroup, click on “not set”. At the Chooser… screen, select Create parameter/attribute in workflow (highlighted in red below).

image

In the Parameter information screen, ensure that the Name says volumeGroup and then click Ok.

image

For volumeList, click on “not set”. volumeList appears in both of the new workflows being wrapped into the larger workflow, and it refers back to (and is equal to) VolumeName. Therefore, at the Chooser… screen, select VolumeName and then click Select.

image

When the final Local Parameter has been mapped, click Close. You will then be taken back to the Schema screen.

image

You will now need to edit Bind Inputs. Click on the edit button for Bind Inputs.

image

At the Edit screen, click on the IN tab at the top. Then click the Bind to workflow/parameter/attribute button.

image

At the Chooser… screen, click on volumeGroup and then click Select. volumeGroup will now be displayed in the IN tab as a parameter.

image

image

Next, click on the General tab. You will see volumeGroup listed under Attributes. It needs to be moved from Attributes to an input parameter. Right-click volumeGroup and select Move as INPUT parameter.

image

Click on the Inputs tab and verify that volumeGroup has now been moved.

image

The workflow should now be ready to test.

Pretty easy, eh?  By following the above procedure, you too will be able to make your own wrapper workflows and extend storage automation and orchestration into your SDDC.

Here are the links for the Software-Defined Storage with NetApp and VMware series:

As always I appreciate the time you have taken to read this post and I’d love to get your feedback and hear what you’d like to see.  Workflow ideas, blog post ideas, and general comments are all welcome.

Thanks for reading!

-McCloud

