
SSIS Text file Transformation data types


I have a 1000-column tab-delimited file that I need to import into SQL 2008.  My issue is that the default data type for the input columns is varchar(50), which fails on data type conversion when the destination table is defined with the correct data types. To get around this I created a dummy text file with all fields maxed out with characters, ran the auto-detect to get the correct lengths, loaded into a staging table with varchars of the same lengths, and then used an SP to convert every stage column to the correct data type for the transfer to the final table.  I did it this way because I was able to do it all programmatically rather than manually stepping through all the input columns.
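A minimal sketch of that staging pattern, using hypothetical table and column names (in practice the real DDL for ~1000 columns would be generated programmatically from the file spec):

```sql
-- Staging table: every column as varchar, sized from the auto-detected lengths
CREATE TABLE dbo.Stage_File01 (
    Col001 varchar(10)  NULL,
    Col002 varchar(25)  NULL,
    Col003 varchar(8)   NULL
    -- ... remaining columns generated from the spec
);

-- Final table with the correct data types
CREATE TABLE dbo.Final_File01 (
    Col001 int          NULL,
    Col002 varchar(25)  NULL,
    Col003 date         NULL
);

-- The conversion SP's core step: cast every stage column on the way across
INSERT INTO dbo.Final_File01 (Col001, Col002, Col003)
SELECT CAST(Col001 AS int),
       Col002,
       CAST(Col003 AS date)
FROM dbo.Stage_File01;
```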

This has been working fine; however, we have 18 of these files and they will be large, so I wanted to use partition switching to get the data from the stage table to the archive. But I can't do this while the data types differ.
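For reference, this is the switch that is blocked by the mismatched types; partition switching requires the source and target tables to have identical column definitions (hypothetical names again):

```sql
-- Succeeds only if dbo.Stage_File01 and dbo.Archive_File01 have identical
-- column definitions and constraints, on the same filegroup -- a varchar
-- stage column against an int archive column makes this fail outright
ALTER TABLE dbo.Stage_File01
SWITCH TO dbo.Archive_File01 PARTITION 1;
```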

So my question is... Can you get SSIS to inherit the data types for a source text file from the destination table?  Or can you disable validation of data type differences at runtime?  Or can you tell it to implicitly convert to the destination data type?  Or is there another solution?



3 Answers Found


Answer 1

By the way, I can't use the auto-detect feature on an original file as standard, as I can't guarantee all columns are populated, or correct according to the file's spec.  My last remaining option is to create a dummy file using the correct data types for this, but if there is a simpler option please share!

Answer 2

A single file with 1000 columns? Really? And a SQL table to match? WOW

Unfortunately, there is no way to tell the Flat File connection manager to auto-discover based on anything other than the file it is hooked to.

But all is not lost. Maybe you could do this: create a simple package that goes the other way, with an OLE DB source that takes the TOP 100 rows and sends them to a flat file with column headers. Then use that file to design the Flat File Connection for importing, and run the auto-discover on the data types. Just realize that the choices it makes may not be 100% accurate, so you are still going to have to verify each and every column.
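A sketch of the OLE DB source query for that reverse-direction package, assuming a hypothetical destination table name:

```sql
-- OLE DB source for the export package: the sampled rows carry the
-- correct types because they come from the properly typed table
SELECT TOP (100) *
FROM dbo.Final_File01;
```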

But sometimes SSIS can do a nice implicit conversion if the data types are close enough. So say for example, your Flat File connection has a data type of DT_STR(30) and that is going into a SQL field of VARCHAR(50). Well, SSIS isn't going to care because it will fit. But if it tries DT_WSTR(50) to VARCHAR(50) then you'll have issues.


Answer 3


Thanks for your reply. 

Actually it is a 2209 column file which I am vertically partitioning into two tables!

Your suggestion is precisely what I have done to create a dummy file of all character types maxing out the fields. However, it is difficult to do for the varying types, as I would have to populate those 100 lines manually or with a data generator, and even then, as you say, the discovery would not necessarily be accurate, so I would still have to check them all!

It's a case of weighing up that effort against the performance gain of the switch operation... and I'm sure it's worth it, but I really am not looking forward to doing it!

I wonder if some sort of "use target DTs" option could be added in a future version. I imagine I would use it for nearly every text file import I do, since I define the table from the provided spec before I do any transformations.



