We have around 5,000 tables in Oracle, and the same 5,000 tables exist in SQL Server. Each table's column list changes frequently, but at any point in time the source and destination columns are always identical. Creating 5,000 Data Flow Tasks is a big pain, and every one of them would need to be remapped whenever a table definition changes, such as when a column is added or removed.
We tried SSMA (SQL Server Migration Assistant for Oracle), but it was very slow at transferring large amounts of data, so we moved to SSIS.
I have followed the approach below in SSIS:
I created a staging table that holds, for each table, the table name, the source query (Oracle), and the target query (SQL Server). An Execute SQL Task selects from this staging table and stores the result set as a Full result set in an Object variable.
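A minimal sketch of what such a staging table and the Execute SQL Task query might look like (all object, column, and schema names here are hypothetical, not the actual ones):

```sql
-- Hypothetical metadata table driving the loop; names are illustrative.
CREATE TABLE dbo.TableTransferConfig (
    TableName        sysname       NOT NULL PRIMARY KEY,
    SourceQuery      nvarchar(max) NOT NULL,  -- SELECT run against Oracle
    DestinationQuery nvarchar(max) NOT NULL   -- query used on the SQL Server side
);

-- Example row for one table; the column lists on both sides must match.
INSERT INTO dbo.TableTransferConfig (TableName, SourceQuery, DestinationQuery)
VALUES (N'EMPLOYEES',
        N'SELECT EMPLOYEE_ID, FIRST_NAME, LAST_NAME FROM HR.EMPLOYEES',
        N'SELECT EMPLOYEE_ID, FIRST_NAME, LAST_NAME FROM dbo.EMPLOYEES');

-- Query placed in the Execute SQL Task (ResultSet = Full result set,
-- stored in an Object variable for the Foreach Loop to enumerate).
SELECT TableName, SourceQuery, DestinationQuery
FROM dbo.TableTransferConfig;
```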
A Foreach Loop Container (Foreach ADO enumerator) iterates over that Object variable, mapping each row to three variables: table name, source query, and destination query.
In the Data Flow Task I chose an OLE DB Source on the Oracle connection, with the data access mode "SQL command from variable", bound to the source-query variable mapped in the loop.
For the destination I chose an OLE DB Destination on the SQL Server connection, with its data access mode set from a variable (the target-query variable mapped in the loop).
The package then loops over all 5,000 tables, but it is not working. Can you please guide me on how to build this dynamically for 5,000 tables from Oracle to SQL Server using SSIS? Any sample code or help would be greatly appreciated. Thanks in advance.