This post originated from an RSS feed registered with Agile Buzz
by Steven E. Newton.
Original Post: When DBAs Go Bad
Feed Title: Crater Moon Buzz
Feed URL: http://www.cmdev.com/buzz/blosxom.cgi?flav=rss
Feed Description: Views and experiences from the software world.
A recent situation I'm aware of appears to demonstrate a specific case of the conflict between developers and database administrators, and the problems it causes.
The application involves tracking sums of money, in the form of allocations which have conditions attached specifying what they may be used for. These allocations are of course tracked and persisted. On the other side, money goes out as it is authorized to specific spending instances in accordance with the limitations set for the allocations. Those authorizations are also tracked and persisted. At any given time the allocations, which may or may not be aggregated into larger sums, have a certain balance remaining for their specified purpose.
In theory, it is possible to determine how much money is available for a specific kind of authorization by taking all the allocations in effect and subtracting all the previously handled authorizations over the necessary span of time. No profiling has been done, but it is known that there are many allocations and authorizations, with various complex rules, so this calculation appears to be computationally expensive. It also just doesn't make sense to derive these numbers every time they are needed.
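The derivation described above can be sketched in a few lines. This is a minimal illustration, not the application's actual code: the tuple-based records and the `available_balances` function are hypothetical stand-ins for the persisted allocation and authorization rows, and the real rules for matching authorizations to allocations are surely more complex.

```python
from collections import defaultdict

def available_balances(allocations, authorizations):
    """Derive the remaining balance per purpose from the raw records.

    Both arguments are iterables of (purpose, amount) pairs standing in
    for the persisted allocation and authorization rows.
    """
    balances = defaultdict(int)
    for purpose, amount in allocations:
        balances[purpose] += amount          # money coming in, earmarked
    for purpose, amount in authorizations:
        balances[purpose] -= amount          # money already authorized out
    return dict(balances)

# Example: two training allocations, one travel allocation, some spending.
allocations = [("training", 5000), ("travel", 3000), ("training", 2000)]
authorizations = [("training", 1500), ("travel", 500)]
print(available_balances(allocations, authorizations))
# {'training': 5500, 'travel': 2500}
```

The point of contention is that with many rows and complex matching rules, rerunning this full pass over the raw data for every lookup is what the developers consider too expensive.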
The developers want to store these sums back in the database as they are calculated, so that the next subtraction need only examine the remaining available calculated amount rather than rerunning all the calculations to determine the available balances. The developers cannot convince the database expert to create any sort of rows, tables, or anything else to allow these available sums to be stored. The developer in charge of database "stuff" insists that because the available sums can be derived at any time from the allocation and authorization data, the application must do this calculation every time. In the last conversation there was talk of putting these sums in some sort of scratch table area, not part of the domain schema for the business, but the idea was not well-received.
At this point, it may be that the best available solution, given the resistance from the database programmer, is for the application to keep the available balances in memory but never persist them, meaning that if there is a restart or some other loss of runtime state, there must be a facility in the code to rerun the calculations forward from the raw data.
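That compromise might look something like the following sketch. Everything here is assumed rather than taken from the actual system: the `BalanceCache` class, the loader callables, and the method names are hypothetical, meant only to show a cache that is maintained in memory, updated as authorizations happen, and rebuilt from the raw persisted data after a restart.

```python
class BalanceCache:
    """In-memory derived balances; never persisted, rebuilt on restart."""

    def __init__(self, load_allocations, load_authorizations):
        # Callables that fetch the raw persisted records, e.g. from the
        # database the DBA does control. Each yields (purpose, amount) pairs.
        self._load_allocations = load_allocations
        self._load_authorizations = load_authorizations
        self._balances = None  # None means "not yet derived this run"

    def rebuild(self):
        """Rerun the full calculation forward from the raw data."""
        balances = {}
        for purpose, amount in self._load_allocations():
            balances[purpose] = balances.get(purpose, 0) + amount
        for purpose, amount in self._load_authorizations():
            balances[purpose] = balances.get(purpose, 0) - amount
        self._balances = balances

    def available(self, purpose):
        if self._balances is None:   # cold start or restart: derive lazily
            self.rebuild()
        return self._balances.get(purpose, 0)

    def record_authorization(self, purpose, amount):
        """Keep the cached balance current as spending is authorized.

        The authorization row itself is persisted through the normal
        domain schema; only the derived sum stays in memory.
        """
        if self._balances is None:
            self.rebuild()
        self._balances[purpose] = self._balances.get(purpose, 0) - amount
```

The cost of this approach is exactly the one noted above: after any loss of runtime state, `rebuild` must replay the expensive derivation before the first lookup can be answered.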
This dispute may come down to a difference between domain state and application state in the interpretation of the user requirements. If nothing in the user requirements calls for the available sums to be persisted, the argument goes, then they don't belong in the database. Several questions seem to need investigation. How does the application maintain state across restarts and other sorts of changes? Will maintaining all of that in memory result in unacceptable overhead? How resource-intensive is the recalculation really, and can it be done lazily, with some intermediate results cached? Is the code for deriving and maintaining state unnecessary complexity?