Introduction by Vishwas Lele:

Amazon Web Services (AWS) CTO Werner Vogels offers this great piece of cloud advice: “Treat everything as a programmable resource, including data centers, networks, compute, storage and load balancers.” In other words, automate every aspect of your (cloud-based) infrastructure. There are significant benefits in following Werner Vogels’ advice:

  1. You can build cost-aware systems by keeping only the parts of the system that are needed and turning off everything else.
  2. Capacity planning is hard. It is much better to build capacity dynamically based on demand.
  3. Failures are not an exception but the rule. Rather than building complex logic to handle exceptions, make your systems fault resilient by provisioning failover resources as needed.
  4. Make your systems more agile: systems that scale with the business rather than to a design-time scaling criterion.

Given AIS’ years of experience with SharePoint, we are always looking for ways to make the underlying infrastructure more cost-effective, scalable, and robust. Fortunately, the aforementioned benefits of automation apply equally to a SharePoint 2013 farm hosted in the cloud — whether it is the ability to dynamically provision a SharePoint 2013 farm on the fly, or the ability to scale up and down based on load, or the ability to make the SharePoint 2013 farm more fault resilient.

But it all begins with developing robust automation scripts to provision and manage a SharePoint 2013 farm. This brings us to the purpose of this blog post by Abhijit Kumar. Abhijit discusses an automated approach for provisioning a SharePoint 2013 farm using Amazon Web Services. It is noteworthy that the automation approach described below is based solely on PowerShell. This might come as a surprise, given that AWS offers services like CloudFormation, which enables the creation of AWS resources, combined with open-source tools such as Opscode Chef and Puppet, which enable the installation and configuration of applications. We chose to rely solely on PowerShell for the following reasons:

  1. PowerShell is Microsoft’s canonical task automation framework, consisting of a command-line shell and a scripting language that has full access to COM and WMI, giving Windows administrators control over every aspect of Windows OS-based machines.
  2. The PowerShell scripting language is based on the .NET Framework. This means a PowerShell script can take advantage of .NET Framework features such as Workflow Foundation (WF). We use WF extensively to manage long-running automation scripts.
  3. AWS CloudFormation is not available in AWS GovCloud. AWS GovCloud is an isolated AWS region designed to allow U.S. government agencies and customers with sensitive workloads to address their specific regulatory and compliance requirements. Given that AIS serves a large number of customers with stringent regulatory and compliance requirements, we needed an automation approach that works in AWS GovCloud.
  4. If you read our earlier blog post about SharePoint 2013 automation on Windows Azure, you will notice that we have been able to achieve a high level of reuse between the Windows Azure and AWS scripts for SharePoint 2013. While the WF-based provisioning logic is largely the same, Azure Service Management SDK calls are replaced with AWS Tools for Windows PowerShell. This reuse gives us the flexibility to offer our customers a choice between the industry-leading IaaS platforms – AWS and Windows Azure.

Abhijit’s post below walks you through the scripts to deploy a SharePoint 2013 farm on AWS in an automated manner. I am confident that you will find it useful. Please give the scripts a try and let us know what you think.

We will be using the following images from the AWS gallery:

  • Image – ami-2c82e345 (Windows Server 2012 Base)
  • Image – ami-2882e341 (Windows Server 2012 with SQL Server Standard)
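As an aside, launching these images is a one-liner with AWS Tools for Windows PowerShell. The fragment below is a minimal standalone sketch, not code from the actual script; the key pair name and security group are placeholders you would replace with your own.

```powershell
# Sketch: launching the two gallery images with AWS Tools for Windows PowerShell.
Import-Module AWSPowerShell

# Windows Server 2012 Base (domain controller and web front ends)
$base = New-EC2Instance -ImageId ami-2c82e345 -InstanceType m1.medium `
    -KeyName MyKeyPair -SecurityGroup default -MinCount 1 -MaxCount 1

# Windows Server 2012 with SQL Server Standard (database server)
$sql = New-EC2Instance -ImageId ami-2882e341 -InstanceType m1.medium `
    -KeyName MyKeyPair -SecurityGroup default -MinCount 1 -MaxCount 1
```

In the actual script, calls like these are made from inside the provisioning workflow rather than interactively.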

All of the sample code discussed in this article is available here. (Please note that this is a sample script, provided as-is without any warranties.)


1. Download the SharePoint 2013 image from Microsoft downloads and upload the image to Amazon S3 Storage.

2. Update the Config.xml with the SharePoint key received after the download.

Change the following line in the XML file:

<PIDKEY Value="[Your Key Value]" />

3. Upload the modified Config.xml and the script files StartUp.ps1 and ADDSForest.ps1 to Amazon S3 storage, and update the URLs in the script (in the workflow AWS-SP-Farm).
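The upload itself can be done with the Write-S3Object cmdlet from AWS Tools for Windows PowerShell; a sketch is below, where the bucket name is a placeholder of our own.

```powershell
# Sketch: uploading the configuration and bootstrap scripts to S3.
# Assumes AWS credentials are already configured; the bucket name is illustrative.
Write-S3Object -BucketName my-sp2013-bucket -File .\Config.xml     -Key Config.xml
Write-S3Object -BucketName my-sp2013-bucket -File .\StartUp.ps1    -Key StartUp.ps1
Write-S3Object -BucketName my-sp2013-bucket -File .\ADDSForest.ps1 -Key ADDSForest.ps1
```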

The script uses a Windows Server 2012 base image for the installation of SharePoint 2013, since the AWS gallery does not offer a SharePoint 2013 VM image by default. We assume the following SharePoint topology (although you can modify the scripts based on your needs):

  • A Windows Server 2012 base image, hosted on a medium VM instance, serving as the Active Directory machine.
  • A Windows Server 2012 with SQL Server 2012 Standard image, hosted on a medium VM instance, serving as the database server.
  • Two Windows Server 2012 base images, each hosted on a medium VM instance, serving as SharePoint web front ends.

We decided to leverage PowerShell Workflow, so you need PowerShell version 3 installed to run the script.

We chose to use a workflow for the following reasons:

  • Ability to resume a workflow from a previous state using checkpoints. (More information here.)
  • Ability to extend the script to run tasks in parallel.
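Both features are shown in the minimal sketch below. The workflow name and the bodies are illustrative, not taken from the actual AWSSPWF.ps1 script.

```powershell
# Sketch of the two PowerShell Workflow features we rely on.
workflow Provision-Sketch
{
    # ... provision the domain controller ...

    Checkpoint-Workflow   # persist state; a resumed workflow continues from here

    # Configure the SQL server and both web front ends at the same time
    parallel
    {
        InlineScript { <# join the SQL VM to the domain, open the SQL port, ... #> }
        InlineScript { <# install SharePoint on web front end 1 #> }
        InlineScript { <# install SharePoint on web front end 2 #> }
    }
}
```

Checkpoints matter here because provisioning a farm takes a long time; if the script fails partway, it can resume from the last checkpoint instead of starting over.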

Steps to execute the script are listed below:

1. Download and install the latest AWS PowerShell tools.

2. Download the scripts from Github Repository.

3. Enable delegation of credentials on the local machine where this script will run.

  • Enable-PSRemoting -Force
  • Enable-WSManCredSSP -Role Client -DelegateComputer * -Force
  • Open gpedit.msc and browse to Computer Configuration > Administrative Templates > System > Credentials Delegation.
    • Double-click “Allow delegating fresh credentials with NTLM-only Server Authentication.” Enable the setting and add the build server to the server list as *.
    • Double-click “Allow delegating fresh credentials.” Enable the setting and add the build server to the server list as *.

4. Enable TCP for all the ports on the default security group.

5. Change the AWS Access Key, Secret Key, and Key Pair Name values in the script (you will find these settings towards the end of the file AWSSPWF.ps1).
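These settings look roughly like the sketch below; the variable name and placeholder values are illustrative, not copied from AWSSPWF.ps1.

```powershell
# Sketch: the credential settings to change near the end of the script.
Set-AWSCredentials -AccessKey "YOUR-ACCESS-KEY" -SecretKey "YOUR-SECRET-KEY"
$keyPairName = "MyKeyPair"   # the EC2 key pair used when launching the VMs
```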

6. Execute the script AWSSPWF.ps1.

At a high level, the script is designed to execute the following steps:

  1. Create a Windows Server 2012 VM for Active Directory.
  2. Create a Windows Server 2012 with SQL Server 2012 Standard VM for Database Server.
  3. Create two Windows Server 2012 VMs for Installing SharePoint 2013 as Web Front End.
  4. Retrieve the certificates from the VMs and install them on the client computer so that commands can be executed using WinRM over HTTPS.
  5. Ping all the machines to verify they are up and running.
  6. Install Active Directory (on the VM tagged as Domain Controller) and promote the server to a domain controller with DNS.
  7. Add required service accounts and domain users.
  8. Once the above steps are complete, it performs the following tasks in parallel:
    1. SQL Server 2012 Standard (1 VM)
      1. Join to the Domain.
      2. Add firewall rule to allow access to SQL Server Port.
      3. Change the service accounts of SQL Server to use domain accounts.
      4. Set the max degree of parallelism to 1 for the SQL Server instance.
    2. SharePoint 2013 (2 VMs)
      1. Increase the C drive space to 100 GB (the default 30 GB is not enough to install SharePoint).
      2. Enable CredSSP (delegation of credentials).
      3. Join to the Domain.
      4. Add the domain user (domain\SPFarm) as a local administrator.
      5. Download the SharePoint 2013 setup files.
      6. Install the required roles and features.
      7. Install SharePoint 2013 Pre-Requisites.
      8. Install SharePoint 2013.
  9. On SharePoint server 1, create a new farm using the configuration scripts.
  10. On SharePoint server 2, run a script to join the farm provisioned in the previous step.
  11. Install the remaining services and Central Administration on the SharePoint server.

Please feel free to use the scripts and modify them to your needs. Let us know if you need any clarification; we will be happy to help!