How do you achieve test isolation when testing Oracle PL/SQL?
In Java projects, JUnit tests do a setup, test, and teardown. Even when you mock out the real db with an in-memory db, you usually roll back the transaction, or drop the db from memory and recreate it, between each test. This gives you test isolation, since one test does not leave artifacts behind that could affect the next test. Each test starts in a known state and cannot bleed over into another one.

Now I've got an Oracle db build that creates 1100 tables and 400K of code - a lot of PL/SQL packages. I'd like not only to test the db install (full - create from scratch; partial - upgrade from a previous db; etc.) and make sure all the tables and other objects are in the state I expect after the install, but ALSO to run tests on the PL/SQL (I'm not sure how I'd do the former exactly - suggestions?).

I'd like this all to run from Jenkins for CI so that development errors are caught via regression testing.

Firstly, I have to use an enterprise version instead of XE because XE doesn't support Java stored procedures, and there's a dependency on Oracle Web Flow. Even if I eliminated those dependencies, the full build typically takes 1.5 hours just to load.

So how do you achieve test isolation in this environment? Use a transaction for each test and roll it back? OK, but what about those PL/SQL procedures that have commits in them?

I thought about backup and recovery to reset the db after each test, or recreating the entire db between tests (too drastic). Both are impractical since the install takes over an hour - doing that for each test is overkill and insane.

Is there a way to draw a line in the sand in the db schema(s) and then roll it back to that point in time? Sorta like a big 'undo' feature. Something besides expdp/impdp or rman. Perhaps the whole approach is off. Suggestions? How have others done this?

For CI, or a small production upgrade window, the whole test suite has to run within a reasonable time (30 mins would be ideal).

Are there products that might help achieve this 'undo' ability?

Communicate answered 10/6, 2011 at 21:10 Comment(1)
You could snap a clone of the db (way faster than exp/imp or rman) and run your tests on the clone (ask your sysadm). – Trypanosome

Kevin McCormack published an article on The Server Labs Blog about continuous integration testing for PL/SQL using Maven and Hudson. Check it out. The key ingredient for the testing component is Steven Feuerstein's utPLSQL framework, an implementation of JUnit's concepts in PL/SQL.

The need to reset our test fixtures is one of the big issues with PL/SQL testing. One thing that helps is to observe good practice and avoid commits in stored procedures: transactional control should be restricted to the outermost parts of the call stack. For those programs that simply must issue commits (perhaps implicitly, because they execute DDL), there is always a test fixture that issues DELETE statements. Handling relational integrity makes those quite tricky to code.
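Such a teardown fixture might look like the following sketch (the table names are purely illustrative); the point is that child tables must be cleared before their parents:

```sql
-- Hypothetical teardown fixture: table names are illustrative.
-- Children must be deleted before parents, or the DELETEs will
-- raise ORA-02292 (integrity constraint violated - child record found).
CREATE OR REPLACE PROCEDURE reset_order_fixture IS
BEGIN
  DELETE FROM order_lines;   -- child of orders
  DELETE FROM orders;        -- child of customers
  DELETE FROM customers;     -- parent
  COMMIT;
END reset_order_fixture;
/
```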

An alternative approach is to use Data Pump. You appear to have ruled out impdp, but Oracle also provides a PL/SQL API for it, DBMS_DATAPUMP. I suggest it here because it provides the ability to trash any existing data prior to running an import. So we can keep an exported data set as our test fixture; executing a SetUp is then a matter of running a Data Pump import job. You don't need to do anything in the TearDown, because that tidying up happens at the start of the next SetUp.
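A sketch of that SetUp using the DBMS_DATAPUMP API - the dump file name and directory object here are assumptions (the dump would be created once beforehand with expdp or the same API):

```sql
-- Hypothetical SetUp: import the fixture schema from a previously
-- exported dump, replacing any tables that already exist.
DECLARE
  h         NUMBER;
  job_state VARCHAR2(30);
BEGIN
  h := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'SCHEMA');
  DBMS_DATAPUMP.ADD_FILE(handle    => h,
                         filename  => 'fixture.dmp',     -- assumed dump file
                         directory => 'DATA_PUMP_DIR');  -- assumed directory object
  -- Trash existing data before loading the fixture
  DBMS_DATAPUMP.SET_PARAMETER(h, 'TABLE_EXISTS_ACTION', 'REPLACE');
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, job_state);
END;
/
```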

Chronogram answered 10/6, 2011 at 22:45 Comment(0)
T
9

In Oracle you can use Flashback technology to restore the database to a point back in time.

http://download.oracle.com/docs/cd/B28359_01/backup.111/b28270/rcmflash.htm
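One way to draw that "line in the sand" is a guaranteed restore point - a sketch, assuming Enterprise Edition with ARCHIVELOG mode and a flash recovery area configured:

```sql
-- Before the test run: mark the point to rewind to.
CREATE RESTORE POINT before_tests GUARANTEE FLASHBACK DATABASE;

-- ... run the test suite ...

-- After the run: rewind. Flashback Database requires MOUNT mode
-- (run as SYSDBA, e.g. from SQL*Plus).
SHUTDOWN IMMEDIATE
STARTUP MOUNT
FLASHBACK DATABASE TO RESTORE POINT before_tests;
ALTER DATABASE OPEN RESETLOGS;
DROP RESTORE POINT before_tests;
```

The bounce through MOUNT mode adds a fixed cost per reset, so this suits resetting between suites rather than between individual tests.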

Tedtedd answered 10/6, 2011 at 21:21 Comment(1)
Flashback Table is a feature which requires an Enterprise Edition license: download.oracle.com/docs/cd/B28359_01/license.111/b28287/… So it is not a universally applicable solution. Also, the database and/or the tables must be configured to support Flashback. – Chronogram

1.5 hours seems like a very long time for 1100 tables and 400K of code. I obviously don't know the details of your environment, but based on my experience I bet you can shrink that to 5 to 10 minutes. Here are the two main installation-script problems I've seen with Oracle:

1. Operations are broken into tiny pieces

The more steps you have the more overhead there will be. For example, you want to consolidate code like this as much as possible:

Replace:

create table x(a number, b number, c number);
alter table x modify a not null;
alter table x modify b not null;
alter table x modify c not null;

With:

create table x(a number not null, b number not null, c number not null);

Replace:

insert into x values (1,2,3);
insert into x values (4,5,6);
insert into x values (7,8,9);

With:

insert into x
select 1,2,3 from dual union all
select 4,5,6 from dual union all
select 7,8,9 from dual;

This is especially true if you run your script and your database in different locations. That tiny network lag starts to matter when you multiply it by 10,000. Every Oracle SQL tool I know of will send one command at a time.

2. Developers have to share a database

This is more of a long-term process solution than a technical fix, but you have to start sometime. Most places that use Oracle only have it installed on a few servers. Then it becomes a scarce resource that must be carefully managed. People fight over it, roles are unclear, and things don't get fixed.

If that's your environment, stop the madness and install Oracle on every laptop right now. Spend a few hundred dollars and give everyone Personal Edition (which has the same features as Enterprise Edition). Give everyone the tools they need and continuous improvement will eventually fix your problems.


Also, for a schema "undo", you may want to look into transportable tablespaces. I've never used it, but supposedly it's a much faster way of installing a system - just copy files instead of importing. Similarly, perhaps some type of virtualization can help - create a snapshot of the OS and database.

Indreetloire answered 14/6, 2011 at 7:26 Comment(0)

Although Oracle Flashback is an Enterprise Edition feature, the technology it is based on is available in all editions, namely Oracle LogMiner:

http://docs.oracle.com/cd/B28359_01/server.111/b28319/logminer.htm#i1016535

I would be interested to know whether anybody has used this to provide test isolation for functional tests, i.e. querying V$LOGMNR_CONTENTS to get a list of UNDO statements from the point in time corresponding to the beginning of the test.

The database needs to be in ARCHIVELOG mode. In the JUnit test case, a method annotated with

@Before

would call DBMS_LOGMNR.START_LOGMNR. The test would run, and then a method annotated with

@After

would query V$LOGMNR_CONTENTS to find the list of UNDO statements. These would then be executed via JDBC. In fact, the querying and execution of the UNDO statements could be extracted into a PL/SQL stored procedure. The order in which the statements are executed would have to be considered.
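A rough sketch of such an undo procedure - the schema name and the five-minute mining window are assumptions, and it also presumes supplemental logging is enabled so that SQL_UNDO can be reconstructed (mining by time on 11g needs CONTINUOUS_MINE):

```sql
-- Hypothetical: replay SQL_UNDO for everything the test changed,
-- newest change first, so later changes are undone before earlier ones.
BEGIN
  DBMS_LOGMNR.START_LOGMNR(
    starttime => SYSDATE - (5 / 1440),   -- assumed start of the test window
    endtime   => SYSDATE,
    options   => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG +
                 DBMS_LOGMNR.CONTINUOUS_MINE);
  FOR r IN (SELECT sql_undo
              FROM v$logmnr_contents
             WHERE seg_owner = 'TEST_SCHEMA'   -- assumed schema under test
               AND operation IN ('INSERT', 'UPDATE', 'DELETE')
             ORDER BY scn DESC) LOOP           -- reverse order
    -- SQL_UNDO statements end with ';', which EXECUTE IMMEDIATE rejects
    EXECUTE IMMEDIATE RTRIM(r.sql_undo, ';');
  END LOOP;
  DBMS_LOGMNR.END_LOGMNR;
  COMMIT;
END;
/
```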

I think this has the benefit of allowing the transaction to commit, which is where an awful lot of bugs can creep in, e.g. referential integrity and primary key violations.

Obsidian answered 9/2, 2013 at 9:28 Comment(0)
