One of the nice features of SAP Data Integrator that DTS just cannot compete with is how it deals with datastores at run time. For an overview, let's think about how the process normally works…
An ETL developer will typically start out working on a development database, honing and coding against a particular database schema until he feels that it is ready for testing by the QA department. The (probably incomplete) database and ETL package will then be passed over to the QA department for testing, and the developer will likely continue coding and working against his development database.
For good reason the QA team will want to test integrations and data veracity against their own separate database. This is because they will want to test uninterrupted by the developer, who will likely break functionality and data on an hourly basis like a heavily caffeinated gorilla with a keyboard. In addition, the QA team will want to be able to keep track of and debug issues as they raise their heads, which, as we all know, would be impossible if the QA and development teams all used the same database.
The above development cycle is an iterative process: eventually the QA team will batter the development team with their own bugs, and together as a team they will arrive at an ETL package which is good enough to release to the client. Again, as you would expect, the client would not wish to utilise the QA or development databases, and would instead expect their own clean and sacrosanct data which the developers should be kept away from at all costs.
So as you can see, the ETL package should be able to deal with multiple legitimate datastores (for both destination and source data), and this is where the Microsoft offering breaks down. SSIS/DTS just cannot cope with this cleanly. I have a live and functioning DTS installation that makes use of Live, Test and Development databases. This is achieved by the invoker specifying 'modes' which then relate to different database configurations and locations. In my instance I actually store all but the destination datastore's configuration in the destination database itself, which is not great for promotion management at all. The point is, I'm struggling to do it in a loosely coupled way without having huge unreadable parameter lists, which I really didn't want in this instance.
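The 'modes' idea above can be sketched roughly as follows — a minimal illustration only, assuming a simple mapping from mode names to server/database pairs. All names here are invented, and a real DTS/SSIS package would hold this in package properties or a configuration table rather than in code:

```python
# Hypothetical sketch: the invoker specifies a mode ("Live", "Test",
# "Development"), which selects a database configuration. Server and
# database names below are illustrative, not from any real installation.

MODES = {
    "Live":        {"server": "prod-sql01", "database": "Sales"},
    "Test":        {"server": "qa-sql01",   "database": "Sales_QA"},
    "Development": {"server": "dev-sql01",  "database": "Sales_Dev"},
}

def connection_string(mode: str) -> str:
    """Build a connection string for the requested mode."""
    try:
        cfg = MODES[mode]
    except KeyError:
        raise ValueError(f"Unknown mode: {mode!r}")
    return (
        f"Server={cfg['server']};"
        f"Database={cfg['database']};"
        f"Trusted_Connection=yes"
    )

print(connection_string("Test"))
```

The obvious drawback, as noted above, is that every new environment or parameter grows this mapping (or the invoker's parameter list), which is exactly the loose coupling problem that SAP's datastore configurations are designed to solve.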
I recently came to do exactly the same with an SAP Data Services project, and I did just as I normally would with DTS… ignored this part of the process totally and simply defined my datastores in the normal manner, hard-coding them to my development database. Later, when I came to think about the implementation and promotion strategy, rather than rushing in and doing as I normally would, I decided to investigate the best methodologies that SAP had to offer; I was pleasantly surprised…