Supports: Should work with any version of SharePoint, but this version has only been tested on SharePoint 2010. There is also a version of PressurePoint Dragon dedicated to SharePoint 2013, leveraging specific 2013 features, at http://gallery.technet.microsoft.com/PressurePoint-Dragon-for-87572ee1

Requires: The .NET 4.5 framework, because the tool makes extensive use of parallel programming techniques. Supports anonymous and Windows (NTLM) authentication.

About PressurePoint

When you apply enough pressure, every application you or somebody else builds has a point where it breaks. I call this point the pressure point. It is strongly advisable to undertake some activities to find out where the pressure point of the application you're responsible for lies. Several kinds of tests are commonly used to find it: performance tests, load tests, and stress tests.

It helps a lot if such tests are repeated throughout the development/test/staging/production environments. This allows you to get a feel for your application. During these tests, you'll typically look at server response time (rather than rendering time): the time it takes from the client making the request to getting the final response back. Because of this, I advise executing performance tests as close to the server or server farm as possible, to eliminate network latency issues. Most of the time, as an application developer or admin you don't have much (or any) control over the network, and you'll be more interested in how the specific application holds up.
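As an illustration of measuring raw server response time (this sketch is not part of PressurePoint, and the URL shown is a placeholder), timing a single HTTP round trip in Python could look like this:

```python
import time
import urllib.request

def time_request(url: str) -> float:
    """Return the elapsed seconds for a single GET request."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as response:
        response.read()  # drain the body so the full response is included in the timing
    return time.perf_counter() - start

# Example with a placeholder URL:
# print(time_request("http://moon/pages/default.aspx"))
```

Running this from a client sitting close to the server farm keeps network latency out of the measurement, as described above.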

Also, although this is quite obvious: if you can avoid it, don't place test clients on the server or server farm itself, or on the host running the virtual machines that contain the server or server farm. This can have quite an effect on the test outcome, although I have to say that in my experience the effect is limited enough to still allow meaningful performance tests launched from the server or server farm. Other quick tips: it typically works better to execute performance tests using multiple client computers, and preferably using multiple user accounts.

Whatever types of tests you're planning to do, please remember that skipping performance testing altogether will result in an interesting product release experience. Lately, I've lost track of the number of times companies have contacted me wishing they had spent some time on performance testing.

Lots of Tools

There are lots of tools out there that can help you do performance testing, but in my experience (and I have looked at over 100 of these tools) they come in two types: tools that are just a preview of a commercial version and too limited to do anything useful without buying the license, and tools that are insanely complex to use. See my blog post at http://sharepointdragons.com/2012/12/26/the-great-free-performance-load-and-stress-testing-tools-that-can-be-used-with-sharepoint-verdict/ for more information. The overview at http://en.wikipedia.org/wiki/Test_tool is also nice and more objective (well, it would be more accurate to say that it refrains from giving any opinion).

So, how to proceed depends on your situation. If you have the budget, you can buy a great performance test tool and use that. I found myself in situations where I had to do performance testing at companies that didn't have a budget to invest in performance tooling. There was also another issue…

About SharePoint

As I mainly work in SharePoint environments, I prefer a tool that can do performance testing specifically targeted at SharePoint. I found none. During my SharePoint testing, uhm, dare I say, adventures, I found that SharePoint page requests are typically handled just fine, and it's hard to bring a SharePoint environment to its knees with page requests alone. Request times tend to increase linearly, which is a good sign for an application. On top of that, SharePoint handles excessive page requests gracefully, without falling back to throwing all kinds of errors. Things get a lot more interesting and dangerous when you do one of the following things:

When using a testing tool that has no knowledge of SharePoint, it is quite hard to test these aspects.

My conclusion

It may come as no surprise that I eventually decided it was easier to build my own tool: one that has specific knowledge of SharePoint, can be extended by me at will, and is easy to use. Thanks to the .NET parallel programming capabilities, this turned out to be quite easy to do. When I was done, I decided to share the basic version of it (basic, since I build custom extensions into it dedicated to the projects I'm doing) with the community. Later, I'm planning to add a specific version dedicated to SharePoint 2013, but I'm not quite there yet.

What to look for?

Doing performance testing in SharePoint environments without knowing what to look for is not the most useful thing one can do with one's time. There are specific performance counters you should look out for on SharePoint WFEs, and different ones to check on the back-end database server. Depending on your needs, you might also need to spend some time coming up with the right set of performance counters for monitoring dedicated application servers. If you want to learn more about this topic, I can definitely recommend my gallery contribution at http://gallery.technet.microsoft.com/PowerShell-script-for-59cf3f70. I'd also recommend my SharePoint Flavored Weblog Reader (SFWR) tool at http://gallery.technet.microsoft.com/The-SharePoint-Flavored-5b03f323, which helps analyze IIS log files.

Whether you use these tools or not: bear in mind that running a performance test tool without analyzing what happens on the server is absolutely useless!

How to use the PressurePoint Dragon for SharePoint

PressurePoint is a command line tool that reads an XML file describing the test you want to execute. Currently, it only supports Windows (NTLM) and anonymous authentication. The PressurePoint ZIP file you download contains three things:

Explanation of the structure of a Test Description file

The test description file can do a couple of simple things. It contains a test body that is repeated x times, as determined by the repeat attribute of the <Test> element.

XML
<?xml version="1.0" encoding="utf-8" ?>
<Test repeat="10">
  [body omitted for clarity]
</Test>

The <Test> element is the root element and occurs only once. It contains one or more <Session> elements. In a Session, you can specify important configuration info:
- user, password, and domain attributes: the credentials used for the session;
- concurrentUsers attribute: the number of concurrent users that start a session (e.g. 20 instances of user A start executing the actions described in the session);
- friendlySessionName attribute: a friendly name that is written to the console window to make it easier to identify which session is executing at a given time.

Please note: if you're using anonymous authentication, the user, password, and domain attributes can simply be left blank.
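For instance, a minimal anonymous session could look like this (the URL and attribute values below are placeholders for illustration, not taken from the tool's documentation):

```xml
<Session user="" password="" domain="" concurrentUsers="10" friendlySessionName="AnonymousSession">
  <Request>http://moon/pages/default.aspx</Request>
</Session>
```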

The following example shows the Session section:

XML
<Session user="administrator" password="verySecret" domain="lc" concurrentUsers="1" friendlySessionName="SessionA">
  [body omitted for clarity]
</Session>

Then there are various actions that can be used within a Session, as the following example demonstrates:

XML
<?xml version="1.0" encoding="utf-8" ?>
<Test repeat="10">
  <Session user="administrator" password="superSecret" domain="lc" concurrentUsers="1" friendlySessionName="SessionA">
    <Comment>Start Moon session A for administrator</Comment>
    <Request>http://moon/pages/default.aspx</Request>
    <RandomRequest>
      <URL>http://moon/pages/default.aspx</URL>
      <RandomDelaySeconds min="1" max="3" />
      <URL>http://moon:28827/sitepages/home.aspx</URL>
    </RandomRequest>
  </Session>
</Test>

The next example shows how to simulate 1000 concurrent users, using 2 different user accounts, in a test that is repeated 100 times:

XML
<?xml version="1.0" encoding="utf-8" ?>
<Test repeat="100">
  <Session user="administrator" password="secretPwd" domain="test" concurrentUsers="500" friendlySessionName="SessionA">
    <Comment>Start session "Home page" for administrator</Comment>
    <Request>http://mysrv/sitepages/home.aspx</Request>
  </Session>

  <Session user="jBlack" password="superSecret" domain="test" concurrentUsers="500" friendlySessionName="SessionA">
    <Comment>Start session A for Jack Black</Comment>
    <Request>http://mysrv/sitepages/home.aspx</Request>
  </Session>
</Test>

Quick tips for constructing performance test cases

The following link contains interesting information about the typical usage of a SharePoint environment: http://office.microsoft.com/en-us/windows-sharepoint-services-it/capacity-planning-for-windows-sharepoint-services-HA001160774.aspx. The quick takeaway is this:

This will help you build test cases that are more realistic, especially in situations where the customer isn't really sure how much the application will be used. Concerning this topic, I've also found the following post quite interesting: http://blogs.technet.com/b/wbaer/archive/2007/07/06/requests-per-second-required-for-sharepoint-products-and-technologies.aspx

As a final guideline, I've also worked with the following rule of thumb that may help you: in a typical enterprise application, 1% of the users make a request per second during peak time; in an extremely heavily used enterprise application, 3% of the users make a request per second during peak time.
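To make the rule of thumb concrete, here is a minimal sketch (the 10,000-user population is hypothetical) that translates a user population into an expected peak request rate:

```python
def peak_requests_per_second(total_users: int, active_fraction: float) -> float:
    """Rule of thumb: a fraction of all users issues one request per second at peak."""
    return total_users * active_fraction

# Typical enterprise application: 1% of users make a request per second at peak.
print(peak_requests_per_second(10_000, 0.01))  # 100.0
# Extremely heavily used application: 3% at peak.
print(peak_requests_per_second(10_000, 0.03))  # 300.0
```

A number like this gives you a starting point for the concurrentUsers values and repeat counts in your test description files.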

Support Tools

It can be frustrating to try a new community tool that doesn't seem to work. It makes you wonder whether you made a mistake in constructing the XML for the test case, or whether the tool simply doesn't work. I've built two tools that support PressurePoint: Ping Dragon for SharePoint 2010 (http://gallery.technet.microsoft.com/Ping-Dragon-for-SharePoint-70fb299e) and WinPing Dragon for SharePoint 2010 (http://gallery.technet.microsoft.com/WinPing-Dragon-for-eefb6dd3). The tools fulfill a single purpose: pinging SharePoint using the same method PressurePoint leverages. In other words, if these tools work, PressurePoint will work too. The difference between the two support tools is that the WinPing Dragon tool hides the password from view, while the Ping Dragon doesn't.

What’s going on under the covers?

Use the Resource Monitor tool (resmon.exe) to “check the heartbeat” of PressurePoint, since the tool itself is a bit of a black box and watching it do its work can be a boring experience. Resource Monitor clearly shows how PressurePoint builds up to the point where it can simulate the required load for the number of users and sessions you specified. PressurePoint executes each session in a separate thread, and Resource Monitor will show the PressurePoint thread counter increase until it approximates the intended load.

The System image normally has the highest number of active threads (a couple of hundred), as you'd expect, but once you're simulating loads of hundreds or even thousands of end users, PressurePoint surpasses it. One thing I found interesting is that it can take quite a long time before a single application actually runs hundreds or even thousands of separate threads (on the environments I've tested it on, it can take an hour or more to reach those kinds of numbers). It makes sense: those are a lot of threads, other threads have to finish their work first, and your system has other tasks to take care of. But still, before building the tool, I didn't anticipate this.
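PressurePoint itself is a .NET tool, but the thread-per-session idea described above can be sketched in a few lines of Python; the session names and the simulated work below are made up for illustration:

```python
import threading
import time

def run_session(name: str, requests: int) -> None:
    """Stand-in for a PressurePoint session: issue a number of (simulated) requests."""
    for _ in range(requests):
        time.sleep(0.001)  # placeholder for an actual HTTP request
    print(f"{name} finished")

# One thread per session, mirroring how PressurePoint parallelizes its load.
sessions = [threading.Thread(target=run_session, args=(f"Session{i}", 5)) for i in range(3)]
for s in sessions:
    s.start()
for s in sessions:
    s.join()
```

With hundreds or thousands of such threads, the ramp-up behavior you can watch in Resource Monitor starts to make intuitive sense.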

Finally

I won’t be adding new features to this version, as I’m in the process of creating a new version specifically for SharePoint 2013. That version will have more intelligent behavior, and once it’s ready, I’ll announce it here. By the way, the figure below shows the output (which can be piped to a text file via pressurepoint.exe > output.txt).

 

More info? Check the SharePoint Dragons blog at http://sharepointdragons.com/ !