This past week a member of my team got a working, configured instance of Quest Performance Analysis for SQL Server installed. I’m pretty psyched that we were able to get a working configuration up and running so quickly. This should help us solve some challenges with studying historical database performance metrics. We have moved nearly all of our performance testing over to our team in India. The problem we run into is that it’s very difficult to see what happened to the system, or to a system component, during a benchmark. While the databases (both SQL Server and Oracle) contain dynamic views to peek inside and understand performance issues, it’s practically impossible for us to capture this data. Why exactly? The main reason is that we shut down and restore the database between every test, so any in-memory performance data is lost.
Quest Performance Analysis for SQL Server gives us a historical view into the database. Once we get the agents installed, we will soon have the ability to study a single instance’s historical performance over the course of a benchmark, as well as compare multiple instances. What’s even more impressive is that we have the same tool for Oracle. The folks at Quest feel that it’s better on the Oracle side than on the SQL Server side.
The wonderful folks at Microsoft have developed a similar tool for SQL Server 2008 as part of the Performance Warehouse suite. They have a beta version of this tool, called DMVStats 1.0, available as open source. Click here for full details on the tool. From my limited experience with both tools, it appears that they have similar functionality, can compare multiple systems, and are easy to configure. DMVStats appears to offer more native performance measurement, similar to Oracle’s Statspack or AWR. The tool provides very clear visualizations that users can interact with.