How to Start Using PowerShell with SharePoint

By Corey Burke

I love PowerShell.
 

When you couple this powerful tool with a complex application such as SharePoint, it opens up a world of interesting and creative ways of troubleshooting and gathering data. However, most people struggle to determine where to start.

At a base level, SharePoint is made up of a configuration database, some service application databases, and a bunch of content databases. This is not discounting all of the binaries and file system objects that exist on each of the SharePoint servers, but the meat of SharePoint exists in SQL and is accessed through a robust .NET API.

 

I find the best place to start is to determine what kind of SharePoint data you want to work with, and, if you are dealing with a large amount of data, what kind of data store would be best for saving that information for later use. The graphic below shows the cmdlet noun for each level and the kind of data that can be worked with at that level. This gives you a good idea of the entry point to use based on what you are trying to accomplish. Notice that lists and items don't have a direct way to access them, which means SPWeb is really the lowest entry point to get to that data.


[Graphic: Understanding SharePoint — cmdlet nouns available at each level of the object hierarchy]
 

 

Once you have an entry point in mind, the next step is to access an object at that level using the standard PowerShell Verb-Noun pair syntax. For example:

$site = Get-SPSite http://portal.contoso.com/sites/some_site

This returns an SPSite object for the site collection at that URL and stores it in the variable $site. From here you can use two very important cmdlets, Get-Member and Get-Help, to figure out the next logical path. Get-Member lists the methods and properties of your newly acquired object, and is used in the format:

$site | Get-Member

This outputs a list of methods and properties that can then be accessed using dot notation, such as $site.AllWebs to show the sub-webs in the site collection.
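Since lists and items don't have cmdlets of their own, you reach them by dropping down to the web level and walking the object model with dot notation. Here is a minimal sketch (the URL is a placeholder, and the Dispose() calls matter because SPSite and SPWeb objects hold unmanaged resources):

```powershell
# Get the site collection, then its root web
$site = Get-SPSite http://portal.contoso.com/sites/some_site
$web = $site.RootWeb

# Enumerate the lists in the web and show how many items each holds
foreach ($list in $web.Lists)
{
    Write-Host "$($list.Title): $($list.ItemCount) items"
}

# Release the unmanaged resources held by the objects
$web.Dispose()
$site.Dispose()
```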


So now we have some idea of where to start and how to dig a little deeper to find the relevant data, but what do we do with the data once we find it? When I am working with large sets of relational data I like to use XML. I might use this when making a blueprint of web applications, their configuration objects, databases, site collections, etc. For day-to-day quick scripting I prefer CSV. It is a very portable format that is great for storing row-like sets of data, and it can be opened in Excel for quick editing before putting it back into SharePoint.
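If you go the XML route, PowerShell can do the serialization for you with Export-Clixml and Import-Clixml; a quick sketch (the file name is arbitrary):

```powershell
# Serialize a rough blueprint of the Web Applications to XML
Get-SPWebApplication | Select-Object DisplayName, Url |
    Export-Clixml "FarmBlueprint.xml"

# Later, rehydrate the saved objects for comparison or reporting
$blueprint = Import-Clixml "FarmBlueprint.xml"
```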

Below is an example of how you can quickly create a CSV file using PowerShell:

# This creates an empty array. This will be used to create an array of rows to be written to the CSV file

$CSV = @()

# $newRecord is an empty object that we are creating with the columns First, Last, SSN, and DOB.
$newRecord = "" | select First, Last, SSN, DOB

# Now that we have the empty object, we are going to use dot notation to add data to each column of the record
$newRecord.First = "Corey"
$newRecord.Last = "Burke"
$newRecord.SSN = "555-55-5555"
$newRecord.DOB = "5/17/2905" #I’m from the future. I come in peace…
 
# Now that the record is populated, we need to add it to the $CSV array.
$CSV += $newRecord

# Once we are finished populating the $CSV array with records, we need to write the array to a .csv file using the Export-CSV cmdlet
$CSV | Export-CSV "PersonalInfo.csv" -NoTypeInformation

The above code should generate a CSV file with the column headers and one populated row. More records can be added by using a loop to create a new $newRecord after the last one has been written to the $CSV array.
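That loop might look like the following sketch, where the sample data is made up and would normally come out of SharePoint:

```powershell
$CSV = @()

# Hypothetical sample data standing in for real SharePoint output
$people = @("Corey Burke", "Jane Doe")

foreach ($person in $people)
{
    # Split each name and build one record per person
    $first, $last = $person -split " "
    $newRecord = "" | select First, Last
    $newRecord.First = $first
    $newRecord.Last = $last

    # Append the finished record to the array
    $CSV += $newRecord
}

$CSV | Export-Csv "People.csv" -NoTypeInformation
```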


“That’s great!” you say, “but what do I do with the CSV? It’s just a bunch of text in a file.”

You would be correct, but the beauty lies in how you can modify that data, or use it as a seed to apply mass changes within SharePoint. Consider the following scenario:

It’s your first day on the job as a SharePoint Administrator for a new company (I’m giving you the benefit of the doubt that you are new, and not the one who committed the following atrocity), and your first task is to replicate the Production SharePoint Farm and all of its contents into a new Development Farm. “Great! That’s easy enough,” you say. And then you open SQL Management Studio and see this list that never seems to end:


 

[Screenshot: SQL Server Management Studio showing a long list of content databases]
 


 

Clearly you have some bad Karma to atone for, but PowerShell is one of the best Karma atonement tools an administrator has at their disposal. Normally this would be a nightmare of documenting all of these content databases and which Web App each belongs to so they can be re-attached correctly in the Development Farm. Here’s what we know we need (more can be added, but this is the bare minimum for this example):
  • Web Application URL
  • Content Database Name (GUID warts and all)
  • Online/Offline status
With the above information we should be able to replicate the content database allocation among the various Web Applications. To gather this information we need to turn to PowerShell:

# I always add the snapin so my scripts can be executed directly from the standard PowerShell console
add-pssnapin microsoft.sharepoint.powershell -EA 0

# Get the current script execution path so our CSV file gets saved back there
[string]$curloc = get-location

# Create the empty CSV array
$arrCSV = @()
# Get an object of all of the Web Applications in the Farm
$webapps = get-spwebapplication

# Loop through each Web App

foreach ($webapp in $webapps)
{   
    # Loop through each database in the Web Application
    foreach ($database in $webapp.contentdatabases)
    {
       # Create the empty record object and start populating it with information about the current Web Application and database
       $objDB = "" | select DBName,DBStatus,WebApplication
       $objDB.DBName = $database.name
       $objDB.DBStatus = $database.status
       $objDB.WebApplication = $webapp.URL

       # Add the new record as a new node in the CSV array
       $arrCSV += $objDB 
    }
}

# Now that we’ve added all of the records to the array, write the CSV array out to the file system
$arrCSV | Export-Csv "$curloc\WADBs.csv" -NoTypeInformation
Here is what the CSV output file will look like:

[Screenshot: WADBs.csv opened in Excel, showing the DBName, DBStatus and WebApplication columns]


 

If you needed to, you could do a find and replace in the above CSV file to convert something like http://community.contoso.com to something like http://dev-community.contoso.com in the case that your DEV Farm will be using alternate URLs from your Production Farm, which it should.
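You can even script that find and replace; a one-liner sketch, assuming the CSV sits in the current directory:

```powershell
# Swap the Production URL for the Dev URL in every row of the CSV
(Get-Content "WADBs.csv") -replace 'http://community\.contoso\.com', 'http://dev-community.contoso.com' |
    Set-Content "WADBs.csv"
```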


Obviously, between here and there, you will need to back up and restore your Production SQL content databases into your Dev SQL instance, but automating that process is a whole other article. Assuming you have completed that task, the next step is to reconnect the content databases to their Web Applications in the new Development Farm by reading in the CSV file we created earlier.

 

# I always add the snapin so my scripts can be executed directly from the standard PowerShell console
add-pssnapin microsoft.sharepoint.powershell -EA 0

# Get the current script execution path so we can read the CSV file back from there
[string]$curloc = get-location

# Read in the CSV file from the file system into an array
$arrCSV = Import-CSV "$curloc\WADBs.csv"

# Loop through each node in the array
foreach ($node in $arrCSV)
{
    # Use the data in the current node/record to connect the database to the correct Web Application
    mount-spcontentdatabase $node.DBName -webapplication $node.WebApplication

    # Now that the database is connected, we need to set its status so it matches the Production Farm
    $ContentDB = get-spcontentdatabase $node.DBName
    $ContentDB.Status = $node.DBStatus

    # Don’t forget to execute the Update() method when setting properties on the database object
    $ContentDB.Update()
}
So there you have it. After some time, depending on how many databases you are working with, the script should complete and you should be able to verify in Central Administration that all of the databases have been attached with the proper status. This is just one of many examples of how you can extract information from SharePoint and use it to automate, or at least greatly speed up, mass changes based on that data.
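If you would rather verify from the console than click through Central Administration, a quick sketch:

```powershell
# List every content database with its status and owning Web Application
Get-SPContentDatabase |
    Select-Object Name, Status, @{Name="WebApplication"; Expression={$_.WebApplication.Url}} |
    Format-Table -AutoSize
```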

Happy hunting!

-Corey Burke
Corey.Burke@Rackspace.co.uk

@CBurke007


 

For more on using PowerShell with SharePoint, check out this extended tutorial video by Racker and SharePoint MVP, Todd Klindt:

     
  
