Why every organisation needs a rebel

Theo Priestley, on the value of rebels:
The trouble with SMEs is that they are invariably as stuck in their ways as any of us are. They know their part of business inside out for sure, to the letter in fact sometimes, but that’s not necessarily a good thing. Because it invites constraint, lack of creative thinking, rigidity. Comfort. ...

The trick has always been to spot the ones bending and breaking the rules or process. They’re the ones that will champion the business to change. Not SMEs.

...

Rebels are rule-benders and rule-breakers who are more tuned into the art of the possible.

Why Every Organization Needs A Rebel

One thing I've learned in 4 years of working in a Fortune 500: Stay curious, keep up to date on emerging technologies, and don't be afraid to bend the rules. Just make sure you don't put anyone's nose out of joint :)    

Remove snapmirror relationships en masse with NetApp’s PowerShell tools

Anyone who's tried to remove snapmirror relationships using the NetApp command line knows how painful it is. Recently, I had a need to remove all snapmirror relationships from a number of NetApp storage systems and figured I'd play with NetApp's PowerShell toolkit to see if I could semi-automate the process. Below are some notes and some code snippets that may help if you ever need to do this yourself.

Environment Setup

If you haven't got them already, download and install PowerShell 4.0 and the latest NetApp DataONTAP PowerShell Toolkit. Then open up PowerShell and run:
 Import-Module DataONTAP

If you get an error about not being able to execute scripts, run:

 Set-ExecutionPolicy -ExecutionPolicy RemoteSigned
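
If you want to sanity-check that the toolkit actually loaded, you can list its cmdlets and count them (the exact number will vary with the toolkit version):

 Get-Command -Module DataONTAP | Measure-Object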

Scenario

I have a number of storage systems, each with many snapmirror relationships with other storage systems. All storage systems will accept the same credentials. I want to remove all snapmirror relationships associated with the destination storage systems. We'll be using the following NetApp cmdlets:
Connect-NaController
Get-NaSnapmirror
Invoke-NaSnapmirrorQuiesce
Remove-NaSnapmirror
To find out more about any of these, use:
get-help <cmdlet name> -full
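
For example, to jump straight to the usage examples for the relationship-listing cmdlet:

get-help Get-NaSnapmirror -examples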

Scripting

First, you may wish to set up an authentication token. I like this method as it's relatively secure and doesn't involve embedding passwords in scripts:
$authentication = Get-Credential
This asks for the credentials via an authentication popup and stores them in a new variable called $authentication, which we can pass to the -Credential parameter when connecting to a storage system later on. Lovely. Let's connect to our destination storage system (that is, the one whose snapmirror relationships we want to kill with fire... I mean, remove cleanly):
Connect-NaController -Name <storage system name> -Credential $authentication

OK, now we need to get a list of all the snapmirror relationships on the storage system and pass those results to the command that will quiesce each of those relationships:

Get-NaSnapmirror | ForEach-Object { Invoke-NaSnapmirrorQuiesce -Destination $_.Destination }

This may take a while.
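
Before moving on, it's worth listing the relationships again and eyeballing their state to confirm everything has actually quiesced:

Get-NaSnapmirror | format-table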

Once it's done, this next code snippet will get that same list of snapmirror relationships and then:
  1. Grab the Source field and use a regular expression to isolate the source storage system name, putting it in its own variable
  2. Pass that source storage system variable as well as the Source and Destination details to the Remove-NaSnapmirror cmdlet
  3. Use the authentication token to authenticate against the source storage system (in order to release the snapmirror relationship)
Get-NaSnapmirror | ForEach-Object {
#strip the colon and everything after it from the Source value, leaving just the source filer name to pass to Remove-NaSnapmirror
$sourcefiler = $_.Source -Replace ':.*'
Remove-NaSnapmirror -Destination $_.Destination -Source $_.Source -SourceController (Connect-NaController $sourcefiler -Credential $authentication -Transient)
}

For every relationship to be removed, you'll be asked if that's really what you want to do. For safety's sake, leave that in place. If you're feeling cavalier, you can automatically say yes to that prompt by replacing the above Remove-NaSnapmirror line with:

Remove-NaSnapmirror -Destination $_.Destination -Source $_.Source -SourceController (Connect-NaController $sourcefiler -Credential $authentication -Transient) -Confirm:$false
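
For reference, here's the whole flow strung together in one place. It uses only the cmdlets shown above, with the same <storage system name> placeholder for the destination; treat it as a sketch to adapt and test rather than a finished tool:

#Prompt for credentials and connect to the destination storage system
$authentication = Get-Credential
Connect-NaController -Name <storage system name> -Credential $authentication

#Quiesce every snapmirror relationship on the destination
Get-NaSnapmirror | ForEach-Object { Invoke-NaSnapmirrorQuiesce -Destination $_.Destination }

#Remove each relationship, connecting to each source filer to release it as we go
Get-NaSnapmirror | ForEach-Object {
$sourcefiler = $_.Source -Replace ':.*'
Remove-NaSnapmirror -Destination $_.Destination -Source $_.Source -SourceController (Connect-NaController $sourcefiler -Credential $authentication -Transient)
}

Add -Confirm:$false to the Remove-NaSnapmirror line, as described above, if you want it to run unattended.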

Your mileage may vary. Let me know how you get on!

Free IT Ops online training courses from Pluralsight until 23rd August

I thought I would share this somewhat rare event in the world of IT training. Running from now until the 23rd of August, the online IT training provider Pluralsight are opening up 36 of their online courses for free to anyone signing up.

Sign up

To sign up for free, go here: https://offers.pluralsight.com/summer

What’s available?

Full list: Camp Pluralsight main page

Most relevant to those of us in IT would probably be the IT Ops track, which includes courses at the IT Ops Beginner and Intermediate levels.

I've used Pluralsight both personally and professionally and really like their courses. If you get something out of this, I'd love to hear about it. Share and enjoy :)

The “why” is important

I like why. It helps me understand context. But in an increasingly email-laden world where everyone is "too busy" and emails are becoming more and more concise, the why is often the first thing to be omitted. And this is a freaking travesty. The "why" helps people understand why something is happening and the impact that it has. It gives context and, if it's a request, it enables the recipients to use their own judgement and experience to assist above and beyond what's originally being asked for (aka adding value). Not only that, but understanding the why makes people care more.

Examples

Here are a few examples of how adding the why helps with context and understanding.

IT Maintenance Window (Understanding)

Too-busy:
Windows servers will be down for maintenance over the weekend. You won't be able to access your files during the outage window.
 With the why:
In order to help keep our computer systems secure from outside threats (such as hacking) and to keep our systems more stable, we will be installing the latest software updates to our Windows servers over the weekend. You won't be able to access your files during the outage window, but we will let you know once the outage is complete.
Storage Archive Request (Adding Value)

Too-busy:
Your project XYZ on the Moon has been inactive for 3 months. Please could you archive project directory XYZ?
With the why:
We're working to free up infrequently used space on our storage systems to help us reduce our storage purchasing needs for the Moon over the next quarter.
We've identified that your project XYZ has been inactive for 3 months. Please could you archive project directory XYZ?
Now, armed with this extra information, I might then start to think about other project areas that could be archived in the future and queue those up. Doing so will help further optimise the storage space and save even more money (the added value). Give your colleagues a chance, and include the why :)

Connecting to multiple storage systems with the NetApp PowerShell Toolkit

Scope

This post discusses methods to connect to multiple NetApp storage systems using the NetApp PowerShell Toolkit, and then execute commands against those storage systems.

Background

OK, so you've set up the NetApp PowerShell Toolkit, and you can connect to a storage system and run some commands. But what if you have more than one storage system?

Conceptual overview

As with all things scripting, there are many ways to achieve this, but here's roughly what I'll be blogging about below:
  1. Getting a list of storage systems:
    1. Via Text File: Keep an easily-accessible text file up to date with all your storage systems
    2. By enumerating from Active Directory: Query AD to get a list of all your storage systems (providing they're in AD!)
  2. Setup authentication details for all the storage systems
  3. Connect to the systems and run commands
Getting a list of storage systems

Text file method

This is the simplest and sweetest method, but it doesn't scale too well in a large organisation, so buyer beware ;) Create a text file in an easily accessible location, such as your home directory or Desktop, and populate it with a list of hostnames, one per line. For example:
cylon82
cylon814
Knowing where the text file is located, we can now set up the base for our PowerShell script:
#Where's the list?
$ListLocation = "C:\Users\Phil\Desktop\filers.txt"
#Read the contents of the list into an array
$FilerList = Get-Content $ListLocation
Now, to see what's inside the $FilerList array, just type:
echo $FilerList
You should now see a list of NetApp storage systems (in this example, cylon82 and cylon814).

Enumerate from Active Directory

Another way to get a more dynamic list of storage systems is to use an Active Directory lookup. In the example below, all the storage systems follow a consistent naming scheme. Many organisations have schemes such as na-site-number (for example, na-cbg-001). If you always name your storage systems this way, you can query AD and locate all your filers, like so:
 import-module ActiveDirectory
 #what are we searching for?
 $FilerNameScheme = "na-site-*"
 #Let's query AD for systems with a name like $FilerNameScheme, and filter only Enabled accounts, and then only show the shortname for the host
 $FilerList = Get-ADComputer -Filter {(Name -like $FilerNameScheme) -and (Enabled -eq "True")} | Select -Expand Name
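
If your filers all live in a dedicated OU, you can also narrow the query with Get-ADComputer's -SearchBase parameter (the OU path below is just an example, swap in your own):

 #restrict the search to a specific OU (the path here is made up)
 $FilerList = Get-ADComputer -SearchBase "OU=Storage,DC=example,DC=com" -Filter {(Name -like $FilerNameScheme) -and (Enabled -eq "True")} | Select -Expand Name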
Now, to see what's inside the $FilerList array, just type:
echo $FilerList
You should now see a list of NetApp storage systems!

Authenticating against multiple storage systems

OK, great. Now that we have a list of storage systems, we can use the $FilerList array to authenticate against all of them:
 #Let's setup the login credentials we'll be using for the storage systems
 $authentication = Get-Credential
 
 foreach ($filer in $FilerList) { 
 
 #add authentication for this filer, using the credentials we just entered
 Add-NaCredential -Name $filer -Credential $authentication 
 
 }
At this point, you'll be shown a list of all the storage systems that now have authentication credentials stored in the toolkit's credential cache.
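
Should you want to review or tidy up that cache later, the toolkit also ships Get-NaCredential (and Remove-NaCredential) alongside Add-NaCredential, so a quick check looks like this:

 Get-NaCredential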

Running Commands

Now that you've got a list of filers inside the $FilerList array and credentials queued up, you can start to run commands against them all using the simple template below. I'll throw some examples in too, in case it helps:
 #for each entry in the list, run these commands…
 foreach ($filer in $FilerList) { 
 
 #connect to the storage system
 Connect-NaController -Name $filer
 
 #Put your commands here, for example...
 #show system version
 get-nasystemversion | format-table
 #show volumes and aggrs
 Get-NaVolContainer ; Get-NaVol | format-table
 # show me the following options settings
 get-naoption cifs.smb2.enable cifs.smb2.client.enable cifs.tcp_window_size cifs.oplocks.enable
 }
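
If you're running this against a long list of filers, it helps to label each system's output so you can see where one filer's results end and the next begin. Here's a minimal variation on the template, using only the cmdlets already shown plus echo:

 foreach ($filer in $FilerList) {

 #make it obvious which system the output below belongs to
 echo "===== $filer ====="

 #connect to the storage system
 Connect-NaController -Name $filer

 #show system version
 get-nasystemversion | format-table

 }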
Putting it all together

Here's an example of a full script which uses a text file for the list of filers, sets up login credentials, and then runs a few commands against each system:
#Where's the list?
$ListLocation = "C:\Users\Phil\Desktop\filers.txt"
#Read the contents of the list into an array
$FilerList = Get-Content $ListLocation

#Let's setup the login credentials we'll be using for the storage systems
$authentication = Get-Credential

foreach ($filer in $FilerList) { 
 
#add authentication for this filer, using the credentials we just entered
Add-NaCredential -Name $filer -Credential $authentication 
 
}

#for each entry in the list, run these commands...
foreach ($filer in $FilerList) { 
 
 #announce, then connect to the storage system
 echo "Connecting to $filer"
 Connect-NaController -Name $filer
 #show system version
 get-nasystemversion | format-table
 #show volumes and aggrs
 Get-NaVolContainer ; Get-NaVol | format-table
 # show me the following options settings
 get-naoption cifs.smb2.enable cifs.smb2.client.enable cifs.tcp_window_size cifs.oplocks.enable
 
}
And finally, the output you'd see is the system version, the volume and aggregate details, and the CIFS option values for each filer in the list, one after another.

How was that?

Let me know in the comments what you're using it for. I'd love to know! :)