Finding the best design approach is only part of the solution to writing an Enterprise Application. As with all software, the application must be tested. Unit tests are the first line of defence, but they only test the units of the application individually. You also need to test how the units interact with each other and with other parts of the application, such as the database. These sorts of tests are called integration tests.
Soon after I discovered unit testing, I accidentally discovered integration testing. I was writing a system that interacted with a database at almost every stage, so each of my tests would start by creating and populating the database; the tests would then run, and afterwards the database would be torn down. Creating and destroying the database for each test ensured that it was always in a known state. It also meant that every test took a very long time to run. When I only had a handful of tests this was not too much of a problem, but once I had in excess of a hundred tests it was a big one! A good general rule is that tests should take close to zero time to run, or people stop running them.
I started to think about how I could improve the performance of my integration tests, so I created a data access layer, as described in my article Data Access Layer Design for Java Enterprise Applications. This enabled my integration tests, which had only been masquerading as unit tests, to become real unit tests: they could use a mocked-out data access layer rather than a real database, thereby reducing their run time to almost zero. However, I still needed true integration tests to test the data access layer itself, and those tests still needed a real database.
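To illustrate the idea, here is a minimal sketch of a unit test that mocks out the data access layer. The `UserDao` interface and `GreetingService` class are hypothetical names invented for this example, not code from the article; the point is only that the service depends on an interface, so a test can substitute an in-memory stand-in for the real database-backed implementation.

```java
// A hypothetical data access interface; in production this would be
// implemented with real database calls.
interface UserDao {
    String findName(int userId);
}

// A service that depends only on the interface, not on a database.
class GreetingService {
    private final UserDao dao;

    GreetingService(UserDao dao) {
        this.dao = dao;
    }

    String greet(int userId) {
        return "Hello, " + dao.findName(userId);
    }
}

public class MockDaoExample {
    public static void main(String[] args) {
        // Mock the data access layer with a lambda: no database is
        // touched, so the test runs in near-zero time.
        UserDao mockDao = userId -> "Ada";
        GreetingService service = new GreetingService(mockDao);
        System.out.println(service.greet(42)); // prints "Hello, Ada"
    }
}
```

Because the mock is just another implementation of the interface, the same service code is exercised in the unit test and in production.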
Most of the time spent setting up and tearing down a database goes into its creation and deletion; actually inserting, updating, reading and deleting data is relatively quick. Most databases support transactions. A transaction prevents a unit of work from being committed to the database until it has completed successfully. If an error occurs part way through the unit of work, the transaction can be rolled back and none of the work carried out is committed. If the unit of work succeeds, it is committed to the database in its entirety once complete.
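The commit/rollback behaviour can be sketched with a tiny in-memory stand-in for a database. `TinyDb` is an illustrative class invented for this example; a real database provides the same semantics via BEGIN/COMMIT/ROLLBACK (with JDBC, `Connection.setAutoCommit(false)` followed by `commit()` or `rollback()`).

```java
import java.util.ArrayList;
import java.util.List;

// A minimal in-memory sketch of transactional behaviour, for
// illustration only: work happens against a copy, and only commit()
// makes it visible.
class TinyDb {
    private final List<String> committed = new ArrayList<>();
    private List<String> working;

    void begin() {
        working = new ArrayList<>(committed);
    }

    void insert(String row) {
        working.add(row);
    }

    void commit() {
        committed.clear();
        committed.addAll(working);
        working = null;
    }

    void rollback() {
        working = null; // discard the uncommitted work
    }

    List<String> rows() {
        return committed;
    }
}

public class TransactionSketch {
    public static void main(String[] args) {
        TinyDb db = new TinyDb();

        db.begin();
        db.insert("alice");
        db.commit();        // unit of work succeeded: changes are kept

        db.begin();
        db.insert("bob");
        db.rollback();      // error part way through: changes discarded

        System.out.println(db.rows()); // prints "[alice]"
    }
}
```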
This can also be applied to testing. At the beginning of an integration test a transaction is started. Data is inserted into the database as necessary and the test is run, which may itself insert or remove data. Once the test is complete the transaction is rolled back, returning the database to the state it was in before the test. Even if the test fails, the transaction is rolled back and the database is returned to its original state.
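The pattern can be sketched as follows. `TestDb` is an illustrative in-memory stand-in invented for this example; in a real test suite the begin would live in the test framework's setUp and the rollback in its tearDown, against a real JDBC connection.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative stand-in for a database connection. With JDBC you would
// call setAutoCommit(false) in setUp and rollback() in tearDown instead.
class TestDb {
    private final List<String> committed = new ArrayList<>();
    private List<String> working;

    void begin() {
        working = new ArrayList<>(committed);
    }

    void insert(String row) {
        working.add(row);
    }

    int count() {
        return working.size();
    }

    void rollback() {
        working = null;
    }

    List<String> rows() {
        return committed;
    }
}

public class TransactionalTestSketch {
    public static void main(String[] args) {
        TestDb db = new TestDb();

        // setUp: start a transaction
        db.begin();

        // the test itself: insert data and check the result
        db.insert("test-row");
        if (db.count() != 1) {
            throw new AssertionError("insert failed");
        }

        // tearDown: roll back, whether the test passed or failed,
        // leaving the database exactly as it was
        db.rollback();
        System.out.println(db.rows().isEmpty()); // prints "true"
    }
}
```

Because the rollback happens in tearDown, it runs even when the test fails, so no test can leak state into the next one.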
Once I discovered this, I moved the creation of the database structure (the database itself, tables, views, stored procedures, reference data, etc.) to my build script and started running my tests in transactions. This greatly improved the performance of my integration tests.