This post originated from an RSS feed registered with .NET Buzz
by Eric Gunnerson.
Original Post: Performance of Generics
Feed Title: Eric Gunnerson's C# Compendium
Feed URL: /msdnerror.htm?aspxerrorpath=/ericgu/Rss.aspx
Feed Description: Eric comments on C#, programming and dotnet in general, and the aerodynamic characteristics of the red-nosed flying squirrel of the Lesser Antilles
It has some good information in it, but I'm not sure you should spend too much time thinking about the issues he's talking about. Groups writing libraries inside Microsoft should take performance very seriously - if we provide a great library with poor performance, it doesn't have much utility for customers. The problem for library writers is that there's no small set of performance scenarios to target - performance matters all over the place, because it's hard to tell what customers are going to do with the library.
But for the majority of applications, this isn't the case. Performance is rarely dominated by small decisions, such as whether you use an ArrayList or a List&lt;int&gt; to store your data. It's usually dominated by algorithmic concerns - how you process data, how much redundant processing you're doing, etc. My experience is that the ability to address these concerns correlates fairly well with how resilient your code is to change, which usually comes down to how clean and understandable the code is. If you have clean code and good tests, it is probably feasible to refactor your code to improve performance once you find out where the performance issues are.
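To make the ArrayList-versus-List&lt;int&gt; micro-decision concrete, here's a minimal sketch (demo code I've written for illustration, not from the original post): ArrayList stores every element as object, so adding an int boxes it on the heap and reading it back requires an unboxing cast, while List&lt;int&gt; stores the values directly with no boxing and compile-time type safety.

```csharp
using System;
using System.Collections;
using System.Collections.Generic;

class BoxingDemo
{
    // Stores the same int in both container types and reads it back.
    // Returns true when both round-trips yield the original value.
    public static bool RoundTripEqual()
    {
        // ArrayList holds elements as object: adding an int boxes it
        // (a heap allocation), and reading it back needs an unboxing
        // cast with a runtime type check.
        ArrayList untyped = new ArrayList();
        untyped.Add(42);                    // int boxed to object
        int fromUntyped = (int)untyped[0];  // unbox

        // List<int> holds the ints directly in a typed array:
        // no boxing, no casts, and the compiler rejects wrong types.
        List<int> typed = new List<int>();
        typed.Add(42);
        int fromTyped = typed[0];

        return fromUntyped == 42 && fromTyped == 42;
    }

    static void Main()
    {
        Console.WriteLine(RoundTripEqual()); // prints "True"
    }
}
```

The per-element boxing cost is real, which is exactly the kind of thing that matters to a library author - but in a typical application it's dwarfed by how the data is processed overall.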
Note that I'm not saying that you should put performance off until the end - that rarely works, as time at the end is fairly precious. I think you should focus on macro performance rather than micro performance.