I’ve designed a model consisting of a Fact table that is expected to process millions of records in the future. I’m not sure whether Astera Data Warehouse Builder can handle and process this load. As far as I understand, this would be a prolonged and resource-intensive process: the data is first read from the source tables, then transformed (retrieving all the applicable surrogate keys from the slowly changing dimensions), and finally loaded into the fact table.
Consider a setup where the source database is SQL Server and the data warehouse database is Oracle. This cross-database scenario worries me even more. In my case, thousands of records will move through the ETL pipeline one by one from SQL Server to Oracle, with the surrogate keys retrieved from the slowly changing dimensions before the final data is loaded into the Fact table.
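For context, here is a minimal sketch of the pattern I’m describing: a per-record surrogate-key lookup against a dimension table, next to the batched key-map alternative I’m hoping the tool uses internally. This is purely illustrative (Python with an in-memory SQLite stand-in for the dimension; all table and column names are hypothetical), not Astera’s actual implementation:

```python
import sqlite3

# Hypothetical customer dimension: surrogate key + business key.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (customer_sk INTEGER, customer_bk TEXT)")
conn.executemany(
    "INSERT INTO dim_customer VALUES (?, ?)",
    [(i, f"CUST-{i:04d}") for i in range(1, 1001)],
)

# Incoming source records carrying only the business key.
source_rows = [{"customer_bk": f"CUST-{i:04d}", "amount": i * 10} for i in (1, 2, 3)]

# Row-by-row pattern: one dimension query per incoming fact record.
fact_rows = []
for row in source_rows:
    (sk,) = conn.execute(
        "SELECT customer_sk FROM dim_customer WHERE customer_bk = ?",
        (row["customer_bk"],),
    ).fetchone()
    fact_rows.append({"customer_sk": sk, "amount": row["amount"]})

# Batched alternative: load the key map once, then resolve keys in memory.
key_map = {
    bk: sk
    for sk, bk in conn.execute("SELECT customer_sk, customer_bk FROM dim_customer")
}
fact_rows_batched = [
    {"customer_sk": key_map[r["customer_bk"]], "amount": r["amount"]}
    for r in source_rows
]

assert fact_rows == fact_rows_batched
```

With millions of records, the row-by-row version issues millions of round trips to the dimension table, which is exactly the overhead I’m worried about across two different database servers.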
If this is how the load works, do you have any optimizations in place to handle it?