This post originated from an RSS feed registered with Java Buzz
by Simon Brown.
Original Post: JSIG
Feed Title: Simon Brown's weblog
Feed URL: http://www.simongbrown.com/blog/feed.xml?flavor=rss20&category=java
Feed Description: My thoughts on Java, software development and technology.
First up was a look at their Java profiling application. Enerjy told us that, in a study commissioned by Sun Microsystems, 50% of projects didn't even consider using a profiling tool. I was quite surprised by this figure - 50% actually sounds on the low side! Quite aside from whether profiling is estimated and planned for, it is typically used in a reactive rather than proactive manner. I'm still unsure how profiling can become a normal part of development, but I suspect that other things need to happen first. For example, continuous performance monitoring still doesn't happen on many projects, and without it, it's hard to know whether you need to profile your app at all. Some good advice was to get the non-functional requirements agreed as early as possible.
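To make the "proactive rather than reactive" point concrete, here's a minimal sketch of what continuous performance monitoring might look like: time a code path against an agreed non-functional requirement and shout when the budget is breached. The ResponseTimeMonitor name and the 200ms budget are my own illustrative assumptions, not anything Enerjy presented.

```java
// Minimal sketch of proactive performance monitoring: time a code path
// and report when it breaches an agreed non-functional requirement.
// The class name and 200ms budget are illustrative assumptions.
public class ResponseTimeMonitor {

    private final long budgetMillis;

    public ResponseTimeMonitor(long budgetMillis) {
        this.budgetMillis = budgetMillis;
    }

    /** Runs the task, returns elapsed millis, and reports a breach if over budget. */
    public long time(String operation, Runnable task) {
        long start = System.nanoTime();
        task.run();
        long elapsedMillis = (System.nanoTime() - start) / 1_000_000;
        if (elapsedMillis > budgetMillis) {
            // In a real system this would feed a dashboard or alert -
            // the signal that it's time to reach for the profiler.
            System.err.println(operation + " took " + elapsedMillis
                    + "ms, budget is " + budgetMillis + "ms");
        }
        return elapsedMillis;
    }

    public static void main(String[] args) {
        ResponseTimeMonitor monitor = new ResponseTimeMonitor(200);
        long elapsed = monitor.time("placeOrder", () -> {
            try { Thread.sleep(50); } catch (InterruptedException e) { }
        });
        System.out.println("placeOrder elapsed: " + elapsed + "ms");
    }
}
```

With numbers like these tracked continuously, the decision to profile becomes data-driven rather than a panic response to a slow production system.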
The other thing Enerjy said is that memory leaks are one of the primary causes of poor performance in J2EE applications. While this is true to some degree, I think there are far bigger contributors, such as incorrect transaction boundaries or too many remote calls. Anyway, back to the profiling tool: it does the usual memory/CPU/thread profiling and has some nice hooks into IDEs such as Eclipse. It seemed to work well, although I didn't really see what it provides over other tools like OptimizeIt, JProbe or JProfiler. Oh, and no support for Mac OS X. ;-)
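For anyone wondering what a J2EE memory leak actually looks like, the classic pattern is a static collection that is written to on every request but never evicted, so entries pile up for the life of the JVM. A small sketch (the SessionCache name and sizes are my own, purely for illustration):

```java
import java.util.HashMap;
import java.util.Map;

// A classic J2EE-era memory leak: a static cache that is populated on
// every request but never evicted, so entries accumulate for the life
// of the JVM. Class and method names are illustrative assumptions.
public class SessionCache {

    // Static map: entries outlive the requests that created them.
    private static final Map<String, byte[]> CACHE = new HashMap<>();

    public static void handleRequest(String sessionId) {
        // Each request parks data in the cache; nothing ever removes it.
        CACHE.put(sessionId, new byte[1024]);
    }

    public static int size() {
        return CACHE.size();
    }

    public static void main(String[] args) {
        for (int i = 0; i < 10_000; i++) {
            handleRequest("session-" + i);
        }
        // 10,000 live entries the garbage collector can never reclaim.
        System.out.println("Cached entries: " + size());
    }
}
```

This is exactly the kind of thing a memory profiler surfaces quickly - a steadily growing object count with GC roots pointing back at the static field.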
After lunch (good pizza!) was a look at their code analysis tool. The presentation started with a look at why many software projects fail, and the key factor that came out was poor code quality. Again, this is one contributing factor, but (IMHO) there are bigger ones, such as a lack of process, failing to manage client expectations, failing to manage changing requirements, and so on. One good point made was that code reviews are typically not estimated and planned for. I agree with this.
Back to the tool and, again, it seemed to work well and integrated nicely with Eclipse. From a functional perspective, the tool lets you perform automatic reviews to ensure that code follows well-known, industry-accepted standards - things like the Sun coding conventions and some of the best practices highlighted by books such as "Effective Java". So how does it compare to open source tools like PMD, Checkstyle and even the rules built into Eclipse itself? One example cited was that you can write your own rules as Java classes. Well, you can do that with PMD too. Rather than waiting for the inevitable question, it might have been better to address this head-on. For me, there's nothing revolutionary here, although both tools will be interesting to keep an eye on. Overall, an interesting session.
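The "rules as Java classes" idea is easy to picture with a deliberately simplified stand-in - this is not PMD's or Enerjy's actual API, just a sketch of the shape of such a rule: a class that inspects source lines and reports violations of a coding convention (here, Sun's discouragement of over-long lines).

```java
import java.util.ArrayList;
import java.util.List;

// Simplified stand-in for the "write your own rule as a Java class"
// idea - NOT PMD's or Enerjy's real API. A rule inspects source lines
// and reports violations of a coding convention.
interface Rule {
    List<String> check(List<String> sourceLines);
}

// Example rule: flag lines longer than a configured maximum,
// in the spirit of the Sun coding conventions.
class LineLengthRule implements Rule {
    private final int maxLength;

    LineLengthRule(int maxLength) {
        this.maxLength = maxLength;
    }

    @Override
    public List<String> check(List<String> sourceLines) {
        List<String> violations = new ArrayList<>();
        for (int i = 0; i < sourceLines.size(); i++) {
            if (sourceLines.get(i).length() > maxLength) {
                violations.add("Line " + (i + 1) + " exceeds "
                        + maxLength + " characters");
            }
        }
        return violations;
    }
}

public class ReviewDemo {
    public static void main(String[] args) {
        Rule rule = new LineLengthRule(80);
        List<String> source = List.of(
                "public class Foo {",
                "    private static final String MESSAGE = \"a line padded out well past the eighty character limit to trip the rule\";",
                "}");
        for (String violation : rule.check(source)) {
            System.out.println(violation);
        }
    }
}
```

Real tools do this against the parsed syntax tree rather than raw lines, which is what lets rules catch things like missing braces or bad exception handling, but the extension model - implement an interface, plug the class in - is the same.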