
We have around 5000 tables in Oracle, and the same 5000 tables exist in SQL Server. Each table's columns change frequently, but at any point in time the source and destination columns are always the same. Creating 5000 Data Flow Tasks is a big pain, and every one of them would need to be re-mapped whenever a table definition changes, such as when a column is added or removed.

I tried SSMA (SQL Server Migration Assistant for Oracle), but it is very slow for transferring huge amounts of data, so I moved to SSIS.

I have followed the below approach in SSIS:

  1. I created a staging table that holds the table name, the source query (Oracle), and the target query (SQL Server). I used that table in an Execute SQL Task and stored the result as a Full result set.

  2. I created a Foreach Loop Container over that Execute SQL Task's result set, with the object variable and three variables: table name, source query, and destination query.

  3. In the Data Flow Task's source I chose an OLE DB Source with the Oracle connection and set the data access mode to "SQL command from variable" (passing the source query via the loop's variable mappings).

  4. In the Data Flow Task's destination I chose an OLE DB Destination with the SQL Server connection and passed the target query via the loop's variable mappings.

And I am looping it over all 5000 tables, but it is not working. Can you please guide me on how to build this dynamically for 5000 tables from Oracle to SQL Server using SSIS? Any sample code/help would be greatly appreciated. Thanks in advance.
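To illustrate, the source and target queries stored in my staging table are generated along these lines (a sketch only; the table and column names here are hypothetical):

```python
def build_queries(table_name, columns):
    """Build the Oracle source query and the SQL Server target query
    for one table, assuming source and destination columns match."""
    col_list = ", ".join(columns)
    placeholders = ", ".join("?" for _ in columns)
    source_query = f"SELECT {col_list} FROM {table_name}"
    target_query = (
        f"INSERT INTO dbo.{table_name} ({col_list}) "
        f"VALUES ({placeholders})"
    )
    return source_query, target_query

# Example staging rows (hypothetical tables), one per Oracle table:
staging = [
    ("EMPLOYEES", ["EMP_ID", "EMP_NAME", "HIRE_DATE"]),
    ("DEPARTMENTS", ["DEPT_ID", "DEPT_NAME"]),
]

for name, cols in staging:
    src, tgt = build_queries(name, cols)
    print(src)
    print(tgt)
```

The Foreach loop then reads one such row per iteration and maps the two query strings into the variables the data flow uses.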

Vikas Kumar
  • Check the following SO post it may help: https://stackoverflow.com/questions/54781017/how-to-map-input-and-output-columns-dynamically-in-ssis – Yahfoufi Jul 12 '19 at 11:09
  • Thanks for response but I already explored the link..it would be better if you can give me some clear steps for above requirement. – Vikas Kumar Jul 13 '19 at 18:42

1 Answer


Using SSIS, when thinking about a dynamic source or destination you have to take into consideration that this only works when the metadata is well defined and fixed at design time. In your case:

Each table's columns vary frequently, but at any point in time source and destination columns will always be the same.

You should think about building the packages programmatically rather than looping over the tables.

That said, you can still use loops if you can classify the tables into groups based on their metadata (column names, data types, ...). Then you can create one package per group.
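The grouping could key each table on an exact signature of its column names and types. A minimal sketch (the metadata rows below are made up; in practice you would load them from Oracle's ALL_TAB_COLUMNS view):

```python
from collections import defaultdict

# Hypothetical metadata: table name -> ((column, data_type), ...) tuples.
tables = {
    "EMPLOYEES":   (("ID", "NUMBER"), ("NAME", "VARCHAR2")),
    "CUSTOMERS":   (("ID", "NUMBER"), ("NAME", "VARCHAR2")),
    "ORDER_ITEMS": (("ORDER_ID", "NUMBER"), ("QTY", "NUMBER")),
}

def group_by_schema(tables):
    """Group table names by their (column name, type) signature;
    tables in one group can share a single package/data flow."""
    groups = defaultdict(list)
    for name, signature in tables.items():
        groups[signature].append(name)
    return dict(groups)

groups = group_by_schema(tables)
# EMPLOYEES and CUSTOMERS share a schema, so they fall into one group.
```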

If you are familiar with C#, you can import the tables dynamically without needing SSIS at all: read each table's schema at run-time, then stream the rows from Oracle into SQL Server with a bulk copy.
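A minimal sketch of that idea, shown here in Python against the generic DB-API rather than C#. In production you would plug in Oracle and SQL Server drivers (e.g. oracledb and pyodbc, which is my assumption, not part of the original setup); the demo below uses sqlite3 on both sides just so it is self-contained:

```python
import sqlite3

def copy_table(src_conn, dst_conn, table, batch_size=1000):
    """Copy one table between two DB-API connections, discovering the
    column list at run-time from the source cursor's description."""
    src = src_conn.cursor()
    src.execute(f"SELECT * FROM {table}")
    columns = [d[0] for d in src.description]   # run-time metadata
    placeholders = ", ".join("?" for _ in columns)
    insert = (f"INSERT INTO {table} ({', '.join(columns)}) "
              f"VALUES ({placeholders})")
    dst = dst_conn.cursor()
    while True:                                  # stream rows in batches
        rows = src.fetchmany(batch_size)
        if not rows:
            break
        dst.executemany(insert, rows)
    dst_conn.commit()

# Self-contained demo, sqlite3 standing in for Oracle / SQL Server:
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
source.execute("CREATE TABLE emp (id INTEGER, name TEXT)")
source.executemany("INSERT INTO emp VALUES (?, ?)", [(1, "a"), (2, "b")])
target.execute("CREATE TABLE emp (id INTEGER, name TEXT)")
copy_table(source, target, "emp")
```

Note that placeholder syntax varies by driver (Oracle drivers typically use `:1`-style binds, while pyodbc uses `?`), so the insert template would need adjusting per driver; on the SQL Server side a bulk-copy API is much faster than row-by-row inserts for 5000 large tables.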


For more information, it is worth reading about creating SSIS packages programmatically and about mapping input and output columns dynamically (the SO post linked in the comments above covers the latter).

Hadi