Wednesday, January 20, 2010

Performance optimizing with the VS2010 profiler I

Optimizing existing code for performance is one of my favorite tasks. It is work that gives immediate, measurable feedback, always teaches you something new, and is in many cases easier than you think, especially if you use the new profiler in VS2010.

Setup
The first thing you need to do is find some way to test that you still deliver the same functionality. Nowadays this tends to be less of a problem, as many projects have some kind of unit tests (although in many cases they are really automated integration tests). If not, it's time to write some.
Another way to accomplish this is to agree on and write some test cases.
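
A minimal sketch of what such a check could look like is shown below; the CalculateOrderTotal function and its expected values are just made-up examples, but something this small is often enough to catch an accidental change in behaviour while you rework the code:

#include <cassert>
#include <numeric>
#include <vector>

// Hypothetical function under optimization: sums an order's line totals.
double CalculateOrderTotal(const std::vector<double>& lineTotals)
{
    return std::accumulate(lineTotals.begin(), lineTotals.end(), 0.0);
}

// Tiny regression check: run it before and after every optimization pass.
int main()
{
    assert(CalculateOrderTotal({}) == 0.0);
    assert(CalculateOrderTotal({10.0, 2.5, 7.5}) == 20.0);
    return 0;
}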

Establish a baseline
Once you have a way to test that you don't (unknowingly) change the functionality, it's time to find and reproduce the performance issues. If they are not reproducible in a new environment, it's in most cases better to try to find the bottlenecks in the environment where they occur before you start optimizing the code.
After reproducing the performance issues, the next step is to establish a baseline. By setting a baseline before you start changing the code, you have something to compare your results against. Establishing a baseline is more a procedure and some extra thought on how to test and measure your progress. In practice, it's the first performance report you collect from your system, saved away.
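
If you also want numbers you can compare run by run, a small timing harness around the slow scenario works fine as a complement to the saved report. In the sketch below, RunHotScenario is just a hypothetical stand-in for whatever scenario you are optimizing:

#include <chrono>
#include <cmath>
#include <iostream>

// Hypothetical stand-in for the scenario you want to speed up.
void RunHotScenario()
{
    volatile double sink = 0.0;
    for (int i = 0; i < 1000000; ++i)
        sink = sink + std::sqrt(static_cast<double>(i));
}

// Time the scenario a few times and print the results; keep the output
// together with the first saved profiler report as the baseline to beat.
int main()
{
    using namespace std::chrono;
    for (int run = 1; run <= 5; ++run)
    {
        auto start = steady_clock::now();
        RunHotScenario();
        auto elapsed = duration_cast<milliseconds>(steady_clock::now() - start);
        std::cout << "run " << run << ": " << elapsed.count() << " ms\n";
    }
    return 0;
}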

Creating a performance session
Now it's time to get started: simply create a new performance session. As before, you can choose between sampling and instrumentation, with sampling providing a quick, good-enough approach and instrumentation a more exact but more involved one. I tend to end up doing the instrumentation, although I must admit sampling really is good enough for me; I guess I like the feeling of knowing exactly what's going on. You can read more at MSDN: Walkthrough: Profiling Applications.
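
Conceptually, instrumentation inserts probes at function entry and exit, which is why it is exact but adds overhead, while sampling just takes periodic snapshots of the call stack. The sketch below only illustrates that entry/exit idea with a hand-rolled timer; it is not what the VS2010 profiler actually injects:

#include <chrono>
#include <iostream>
#include <string>

// Conceptual illustration only: an RAII timer that measures a function
// from entry to exit, roughly what an instrumentation probe does.
class ScopeTimer
{
public:
    explicit ScopeTimer(std::string name)
        : name_(std::move(name)), start_(std::chrono::steady_clock::now()) {}

    ~ScopeTimer()
    {
        auto elapsed = std::chrono::duration_cast<std::chrono::microseconds>(
            std::chrono::steady_clock::now() - start_);
        std::cout << name_ << " took " << elapsed.count() << " us\n";
    }

private:
    std::string name_;
    std::chrono::steady_clock::time_point start_;
};

void DoWork()
{
    ScopeTimer timer("DoWork");   // exact entry/exit timing, but adds overhead
    // ... actual work ...
}

int main()
{
    DoWork();
    return 0;
}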

All-in-one profiling
A new feature in the VS2010 profiler is that it now collects data about calls to other tiers, like the database. Before, I was most of the time forced to use both the Visual Studio profiler and SQL Server Profiler to collect information about database interactions and their execution times. With VS2010 I can now collect basic database interaction information directly in the profiler, showing the queries executed and their execution times. This information is valuable enough to point me in the right direction as to which query to look deeper into, although I can still see cases where I will use SQL Server Profiler.
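
The manual alternative used to be wrapping the data-access calls with your own timing and correlating that with a SQL Server Profiler trace, roughly like the sketch below, where ExecuteQuery is a hypothetical stand-in for your real data-access call:

#include <chrono>
#include <iostream>
#include <string>

// Hypothetical stand-in for the real data-access call.
void ExecuteQuery(const std::string& sql)
{
    // ... send the query to the database and read the result ...
    (void)sql;
}

// Wrap the call with timing and log it; this is roughly the per-query
// information the VS2010 profiler now collects for you automatically.
void ExecuteAndLog(const std::string& sql)
{
    auto start = std::chrono::steady_clock::now();
    ExecuteQuery(sql);
    auto elapsed = std::chrono::duration_cast<std::chrono::milliseconds>(
        std::chrono::steady_clock::now() - start);
    std::cout << elapsed.count() << " ms: " << sql << "\n";
}

int main()
{
    ExecuteAndLog("SELECT COUNT(*) FROM Orders");
    return 0;
}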

Better performance reports
Another great improvement from a usability perspective is the new graphical presentation of your functions, showing both a graphical representation of the time spent in the function and the code annotated with execution times! The graphical representation of the function calls is also clickable. This makes it really simple and almost joyful to navigate around in your call stack, looking at the code and the execution times, giving you a quick idea of what can and needs to be improved.
Colin Thomsen has written two posts about the new performance reports: VS2010: New Profiler Summary Page and VS2010: Investigating a sample profiling report (Function Details).