
RE: [XP] Database Testing & Simple Code

  • Bayley, Alistair
    Message 1 of 2, Jul 28, 2006
      There is also a separate Yahoo group for Agile databases:
      http://groups.yahoo.com/group/agileDatabases/

      Alistair

      > -----Original Message-----
      > From: extremeprogramming@yahoogroups.com
      > [mailto:extremeprogramming@yahoogroups.com] On Behalf Of
      > yahoogroups@...
      > Sent: 27 July 2006 15:21
      > To: extremeprogramming@yahoogroups.com
      > Subject: Re: [XP] Database Testing & Simple Code
      >
      > I think I'd start out with the Repository pattern in
      > order to encapsulate the SQL. See Domain-Driven
      > Design by Eric Evans.
      >
      > One of the advantages of Repositories is that they
      > can be pluggable, so you can replace them with
      > fast testing versions when desirable, and then
      > switch them to a test database, or to the
      > production database when needed.
      >
      > Another advantage is that, since they encapsulate
      > the SQL, and only in the versions that face the
      > database, your database expert has a known place to
      > look for the SQL.
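      >
      > A rough sketch of what that might look like in Java (the Order
      > record, the orders table and the column names are invented for
      > illustration, and a recent Java is assumed for the record syntax):
      >
      > import java.math.BigDecimal;
      > import java.sql.*;
      > import java.util.*;
      > import javax.sql.DataSource;
      >
      > // Hypothetical domain object for the sketch.
      > record Order(long id, BigDecimal total) {}
      >
      > // The rest of the application talks to this interface only.
      > interface OrderRepository {
      >     List<Order> findByCustomer(String customerId);
      > }
      >
      > // Production version: the only place this SQL lives.
      > class JdbcOrderRepository implements OrderRepository {
      >     private final DataSource dataSource;
      >
      >     JdbcOrderRepository(DataSource dataSource) { this.dataSource = dataSource; }
      >
      >     public List<Order> findByCustomer(String customerId) {
      >         String sql = "SELECT id, total FROM orders WHERE customer_id = ?";
      >         try (Connection c = dataSource.getConnection();
      >              PreparedStatement ps = c.prepareStatement(sql)) {
      >             ps.setString(1, customerId);
      >             try (ResultSet rs = ps.executeQuery()) {
      >                 List<Order> orders = new ArrayList<>();
      >                 while (rs.next()) {
      >                     orders.add(new Order(rs.getLong("id"), rs.getBigDecimal("total")));
      >                 }
      >                 return orders;
      >             }
      >         } catch (SQLException e) {
      >             throw new RuntimeException(e);
      >         }
      >     }
      > }
      >
      > // Fast in-memory version that unit tests can plug in instead.
      > class InMemoryOrderRepository implements OrderRepository {
      >     private final Map<String, List<Order>> byCustomer = new HashMap<>();
      >
      >     void add(String customerId, Order order) {
      >         byCustomer.computeIfAbsent(customerId, k -> new ArrayList<>()).add(order);
      >     }
      >
      >     public List<Order> findByCustomer(String customerId) {
      >         return byCustomer.getOrDefault(customerId, List.of());
      >     }
      > }
      >
      > Swapping between the two versions is just constructor wiring, which
      > is what keeps the fast tests cheap.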
      >
      > John Roth
      >
      >
      > ----- Original Message -----
      > From: "Simon Jones"
      > <simon.at.tcbventures.com@...>
      > To: "extremeprogramming@yahoogroups.com"
      > <extremeprogramming.at.yahoogroups.com@...>
      > Sent: Thursday, July 27, 2006 12:04 AM
      > Subject: [XP] Database Testing & Simple Code
      >
      >
      > Hi,
      >
      > I'm looking for a little advice or some pearls of wisdom around
      > managing test data for XP projects which are focused primarily
      > around databases (Oracle in particular, if that's relevant).
      >
      > Double-pronged message, as I'd also be interested in any thoughts
      > on how to translate XP's idea of simple, clean code into DB
      > development, i.e. PL/SQL.
      >
      > I'll start with the latter.
      >
      > For some reason (just lucky I guess) most of the XP work I've been
      > involved with seems to have revolved around an Oracle database.
      > Ideas like refactoring, test-first, continuous integration etc. have
      > been readily applicable, but one area in which I'm often confronted
      > with a challenge is how to make SQL and PL/SQL 'simple'. I am far
      > from being an expert with Oracle, but it seems to me that
      > conventional wisdom says that it is better to have one large SQL
      > statement than many small ones when there is a direct choice. This
      > appears to be driven primarily by performance.
      >
      > XP suggests that optimisation ought to be left until last, i.e. only
      > do it when you know you need it. However, this doesn't seem to sit
      > well with databases in general as, almost by definition, they are
      > likely to be the slowest aspect of an application. What I mean here
      > is that whilst you may get away with a non-optimal solution in
      > something like Java, the penalties in Oracle are more severe and
      > more visible. Databases that respond directly to human interfaces
      > (i.e. web-based ones) are particularly good at 'advertising' non-
      > optimal code.
      >
      > The problem for me, however, is that naturally I want maintainable
      > code: code that can be worked on by the whole team, not just the
      > Oracle guru. There often tend to be only one or two Oracle experts,
      > and they quickly get spread too thinly.
      >
      > A single SQL statement can quickly become difficult to read, if not
      > entirely unfathomable, even though from a SQL perspective it is
      > entirely correct and proper. (Currently we have a data model based
      > around the GoF composite pattern requiring tree traversal through
      > CONNECT BY; we are not in a position to make drastic structural
      > changes to the database.)
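      >
      > For what it's worth, one thing that can make such a statement easier
      > to live with is giving it a single, named home. A hypothetical Java
      > sketch (table and column names are invented; the real query would be
      > the hairy one):
      >
      > import java.sql.*;
      > import java.util.*;
      > import javax.sql.DataSource;
      >
      > // Hypothetical: the hierarchical query lives in exactly one place,
      > // behind a method name that says what it means in domain terms.
      > class CompositeNodeRepository {
      >     // Walks the composite tree downwards from a given root node.
      >     private static final String FIND_SUBTREE =
      >         "SELECT id, parent_id, name " +
      >         "  FROM nodes " +
      >         " START WITH id = ? " +              // the root we were given
      >         " CONNECT BY PRIOR id = parent_id";  // follow parent-to-child links
      >
      >     private final DataSource dataSource;
      >
      >     CompositeNodeRepository(DataSource dataSource) { this.dataSource = dataSource; }
      >
      >     List<String> findSubtreeNames(long rootId) {
      >         try (Connection c = dataSource.getConnection();
      >              PreparedStatement ps = c.prepareStatement(FIND_SUBTREE)) {
      >             ps.setLong(1, rootId);
      >             try (ResultSet rs = ps.executeQuery()) {
      >                 List<String> names = new ArrayList<>();
      >                 while (rs.next()) {
      >                     names.add(rs.getString("name"));
      >                 }
      >                 return names;
      >             }
      >         } catch (SQLException e) {
      >             throw new RuntimeException(e);
      >         }
      >     }
      > }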
      >
      > Furthermore, a single SQL statement is capable of returning a wide
      > array of possible results involving complex joins across many
      > tables. This presents quite a few challenges in terms of unit
      > testing (maintaining large data sets, for one thing). There can be
      > many permutations of a single result set, and not checking them
      > thoroughly quickly leads to pain later.
      >
      > This leads me nicely into my second question: how to maintain and
      > organise the (typically) quite large sets of data required to
      > adequately unit test a database.
      >
      > Obviously one needs to set some well-known targets to verify the
      > tests. However, if a given pair is to avoid coming up with such
      > large data sets manually, then naturally they want to write a query
      > to extract the data. The problem here is that you almost end up
      > writing the same query to extract the expected data as the query
      > that will end up in your production code. This isn't therefore much
      > of a test.
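      >
      > One way out of that trap is to shrink the data until the expected
      > answer can be worked out by hand. A hypothetical JUnit 4 sketch,
      > reusing the CompositeNodeRepository from the earlier sketch (the
      > nodes table and the testDataSource() helper are assumptions, not
      > real project code):
      >
      > import static org.junit.Assert.assertEquals;
      >
      > import java.sql.Connection;
      > import java.sql.Statement;
      > import java.util.List;
      > import javax.sql.DataSource;
      > import org.junit.Before;
      > import org.junit.Test;
      >
      > public class CompositeNodeRepositoryTest {
      >     private CompositeNodeRepository repository;
      >
      >     // Assumed helper: returns a DataSource for a throwaway test schema.
      >     private DataSource testDataSource() {
      >         throw new UnsupportedOperationException("wire up the test schema here");
      >     }
      >
      >     @Before
      >     public void insertTinyHandWrittenTree() throws Exception {
      >         DataSource ds = testDataSource();
      >         try (Connection c = ds.getConnection();
      >              Statement s = c.createStatement()) {
      >             // Three rows typed in by hand -- small enough to check on paper.
      >             s.execute("DELETE FROM nodes");
      >             s.execute("INSERT INTO nodes (id, parent_id, name) VALUES (1, NULL, 'root')");
      >             s.execute("INSERT INTO nodes (id, parent_id, name) VALUES (2, 1, 'child')");
      >             s.execute("INSERT INTO nodes (id, parent_id, name) VALUES (3, 2, 'grandchild')");
      >         }
      >         repository = new CompositeNodeRepository(ds);
      >     }
      >
      >     @Test
      >     public void walksTheWholeSubtreeUnderTheRoot() {
      >         List<String> names = repository.findSubtreeNames(1);
      >         // The expected count is worked out by hand, not by another query.
      >         assertEquals(3, names.size());
      >     }
      > }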
      >
      > The problem is further compounded because very often business data
      > changes. The ideal situation is to be working with test data that is
      > as close to production data as possible, but, armed with a barrage
      > of unit tests, continually changing that data can quickly become
      > cumbersome. The business also often fails to be sufficiently
      > rigorous about how it manages its standing data and production
      > data. Unpleasant surprises abound.
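      >
      > One technique that can localise that churn (a suggestion, not
      > something tried here) is a test-data builder: each test states only
      > the facts it cares about and the builder supplies realistic
      > defaults, so when standing data changes the edit is in one place. A
      > hypothetical sketch with invented names:
      >
      > import java.math.BigDecimal;
      >
      > // Hypothetical builder: the realistic defaults live in exactly one
      > // place, so a change to standing data means one edit, not hundreds.
      > class OrderRowBuilder {
      >     private String customerId = "CUST-001";
      >     private BigDecimal total = new BigDecimal("99.95");
      >     private String status = "OPEN";
      >
      >     OrderRowBuilder forCustomer(String c) { this.customerId = c; return this; }
      >     OrderRowBuilder withTotal(String t) { this.total = new BigDecimal(t); return this; }
      >     OrderRowBuilder withStatus(String s) { this.status = s; return this; }
      >
      >     // Emits an INSERT for a fixture; a real version would bind
      >     // variables rather than concatenate strings.
      >     String toInsertSql() {
      >         return "INSERT INTO orders (customer_id, total, status) VALUES ('"
      >                 + customerId + "', " + total + ", '" + status + "')";
      >     }
      > }
      >
      > A test that only cares about cancelled orders then reads
      > new OrderRowBuilder().withStatus("CANCELLED").toInsertSql()
      > and keeps working when the default customer or totals change.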
      >
      > The same is true of acceptance tests. We want to be in a position
      > where we can run acceptance tests against data that is relatively
      > close to 'live'. Tricky to do if your data is constantly evolving.
      >
      > Some things that we have tried are:
      > MULTISET functionality (I think this is Oracle-specific). This turns
      > out to be quite a good way to break queries down into something more
      > maintainable, i.e. combining sets of results with INTERSECT and
      > UNION. But this is only applicable sometimes (there is a small
      > sketch of the idea after this list).
      >
      > Using more de-normalisation than your typical DB designer might be
      > comfortable with. It does, however, require more 'management' code
      > to ensure the de-normalised data is kept in sync. This begins to
      > violate the XP principles of once-and-only-once and simple, clean
      > design.
      >
      > Writing automated scripts to move current data to newer production
      > data. It can be done, but once datasets grow large the amount of
      > development effort required to update tests and 'morph' data can
      > quickly become onerous.
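      >
      > Here is the kind of thing meant by the set-combination point above,
      > as a hypothetical sketch (table and column names invented):
      >
      > // Hypothetical illustration of breaking one statement into named
      > // pieces that are recombined with set operators.
      > class OverdueOrderQueries {
      >     private static final String UNPAID_ORDERS =
      >         "SELECT order_id FROM orders WHERE paid = 'N'";
      >
      >     private static final String ORDERS_OLDER_THAN_30_DAYS =
      >         "SELECT order_id FROM orders WHERE order_date < SYSDATE - 30";
      >
      >     // Still one statement for the optimiser, but each half can be
      >     // read (and tested) on its own.
      >     static final String OVERDUE_ORDERS =
      >         UNPAID_ORDERS + " INTERSECT " + ORDERS_OLDER_THAN_30_DAYS;
      > }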
      >
      > All of these help, but it still seems far harder to apply XP to SQL
      > code than to its counterparts in other languages.
      >
      > In many ways applying XP to databases actually promises the highest
      > gains, as traditionally databases can quickly become 'legacy'
      > systems through poor maintenance. But because the database has a
      > more fundamental connection to the data, it would seem that the
      > price of applying XP is higher.
      >
      > But as I say, I am not a database expert. Hopefully there are some
      > interesting tips and tricks out there.
      >
      > Cheers
      > Simon