
SSIS Text file Transformation data types


I have a 1000-column tab-delimited file that I need to import into SQL 2008.  My issue is that the default data type for the input columns is varchar(50), which fails on data type conversion when the destination table is defined with the correct data types. To get around this I created a dummy text file with all fields maxed out with characters, then ran the auto-detect to get the correct lengths, then used a staging table with varchars of the same lengths, then an SP converting all stage columns to the correct DT's for the transfer to the final table.  I did it this way as I was able to do it all programmatically rather than manually stepping through all the input columns.
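For illustration, a minimal sketch of that staging-to-final step, with hypothetical table and column names (the conversions and the date style are assumptions; the real SP is generated programmatically from the spec):

CREATE PROCEDURE dbo.usp_LoadFinalFromStage
AS
BEGIN
    -- Stage columns are all varchar; convert each to the type the spec defines
    INSERT INTO dbo.FinalTable (Col1, Col2, Col3)
    SELECT CONVERT(int, Col1),
           CONVERT(decimal(18, 2), Col2),
           CONVERT(datetime, Col3, 103)  -- 103 = dd/mm/yyyy, an assumed style
    FROM dbo.StageTable;
END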

This has been working fine; however, we have 18 of these files and they will be large, so I wanted to use partition switching to get the data from the stage table to the archive. However, I can't do this when the DT's are different.
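For reference, a partition switch is a metadata-only operation, but it requires the stage and archive tables to match exactly in column types, nullability, and indexes; a minimal sketch with hypothetical names:

-- The stage table needs a check constraint matching the target partition's boundary
ALTER TABLE dbo.StageTable SWITCH TO dbo.ArchiveTable PARTITION 1;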

So my question is... Can you get SSIS to inherit the DT's for a source text file from the destination table?  Or can you disable (at runtime) the validation of data type differences?  Or can you tell it to implicitly convert to the destination data type?  Or is there another solution?



3 Answers Found


Answer 1

By the way, I can't use the auto-detect feature on an original file as standard, as I can't guarantee all columns are populated, or correct according to the file's spec.  My last remaining option is to create a dummy file using the correct DT's for this, but if there is a simpler option please share!

Answer 2

A single file with 1000 columns? Really? And a SQL table to match? WOW

Unfortunately, there is no way to tell the Flat File connection manager to auto-discover based on anything other than the file it is hooked to.

But all is not lost. Maybe you could do this: create a simple package that goes the other way, with an OLE DB source that takes the TOP 100 rows and sends them to a flat file with column headers. Then use that file to design the Flat File Connection for importing, using the auto-discovered data types. But realize that the choices it makes may not be 100% accurate, so you are still going to have to verify each and every column.

But sometimes SSIS can do a nice implicit conversion if the data types are close enough. Say, for example, your Flat File connection has a data type of DT_STR(30) and that is going into a SQL column of VARCHAR(50). SSIS isn't going to care, because it will fit. But if it tries DT_WSTR(50) to VARCHAR(50), then you'll have issues.


Answer 3


Thanks for your reply. 

Actually, it is a 2209-column file, which I am vertically partitioning into two tables!

Your suggestion is precisely what I have done to create a dummy file of all character types maxing out the fields. However, it is difficult to do it for the varying types, as I would have to populate those 100 lines manually or using a data generator, and even then, as you say, the discovery would not necessarily be accurate, so I would still have to check them all!

It's a case of weighing up that effort against the performance gain of the switch operation... and I'm sure it's worth it, but I really am not looking forward to doing it!

I wonder if some sort of "use target DT's" option could be added in future versions; I imagine I would use it for nearly every text file import I do, since I define the table from the provided spec before I do any transformations.


How can I convert a record from a flat file to a different data type without using the Data Conversion task?

Hello everyone,

I've been having an issue trying to run my SSIS package on a server; it seems to be failing on the OLE DB Command step.  What we have in our SQL 2005 DB is a User-Defined Data Type (base type char(7)), and the OLE DB Command is supposed to call a proc that passes in a value of this data type.

i.e.:  CREATE PROCEDURE myProcedure ( @passedInFromSSIS MY_DATATYPE ) AS ....
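For context, a sketch of the setup described above (the CREATE TYPE statement and its NOT NULL clause are assumptions based on the description):

-- Alias type with base type char(7)
CREATE TYPE dbo.MY_DATATYPE FROM char(7) NOT NULL;
GO
-- The OLE DB Command's SqlCommand property would then be:  EXEC myProcedure ?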

In my SSIS package, I have the type defined as DT_STR with a length of 7. 

Now, when I run the package locally (via Visual Studio), the process runs successfully.  However, once the package is deployed on a server and run from an application (note: it is run under a different user), the process fails on a validation step with an "Invalid Parameter Number" error.

Now, if I change the input parameter in my proc to the base type of the user-defined data type, the process works again.

Has anybody run into a similar issue, or know what may be causing it?  I first suspected I needed to grant permissions on the user-defined data type (since I was able to run it under my security context, but not under the application's), however I noticed that there isn't security tied to the types.

Any other thoughts?  Please let me know if you need further explanation. 



I have a txt file, with a flat file connection manager inserting the data into a SQL table. Either the last column has two fields in it, or the columns do not show up properly. It was working in SQL 2000 but does not in SQL 2008. The file looks something like this...


011205010001002451FAGER ELISABETH R 11005847 11111119400010000RU 84 03261801510002A000000MERIDIAN RD 15044GIBSONIA PA RICHLAND 04402824604PINE-RICHLAND - REGION 2 194142113 7770001RICHLAND 000000000000000000000000000000000000000000000001222222222220000000000000000000000000000000

012309010001010244REID MARJORIE M 02420 11111110100010000RU 84 05082701660001A000000MASONIC DR 15143SEWICKLEY PA ALEPPO 04374430267QUAKER VALLEY - REGION 3 000063103 4010001ALEPPO 000000000000000000000000000000000000000000000000000000022220000000000000000000000000000000

071307010001015333CLEVELAND SANDRA L 00101 11111118202020000RF 84 03063901710003A000000BRISTOL SQ 15238PITTSBURGH PA OHARA 04383020426FOX CHAPEL AREA-REGION 1 000052 19 7250202OHARA 000000000000000000000000000000000000000000000000000002112210000000000000000000000000000000

Please get back to me as soon as possible, since this is a priority.



I have a text file with the following format (HED - header row, CL - data (sample of 3 rows), TRL - trailer row):


CL 012352908607309000014090 2010-03-04 16:12:53.077000000 Inserted / Log ICode Action on Order Number / 33949591 / with / I-Code / IDEC/ / Memo: / cci questions about alert/

CL 012352958107309000020378 2010-04-14 10:20:03.340000000 Inserted / Log ICode Action on Order Number / 36992594 / with / I-Code / ISRB,INEW,ILKD/ / Memo: / cci to cxl, saved with benefits and gave her instructions to move account online/

CL 012353013807309000021681 2009-11-24 17:09:05.973000000 Inserted / Log Customer Info Change Action CL 012353013807309000021681 2009-11-24 17:09:05.973000000 Updated / Log Customer Info Change Action

TRL201008240001PCMCL 0000007349

I have to import the above data into the following table:

CREATE TABLE [dbo].[Log] (
    [Type] [varchar](3) NULL,
    [Number] [varchar](14) NULL,
    [Customer_ID] [varchar](10) NULL,
    [CallDateTime] [varchar](30) NULL,
    [Info] [varchar](5000) NULL
)

If you pay attention to the data, it has fixed-width columns, as well as '/' separators after "Inserted".

I need the data mapped just like in the example below.

For instance, the data should map like this:

Type (column width 3)  -  CL

Number (column width 14)  -  01235290860730

Customer_ID (column width 10)  -  9000014090

CallDateTime (column width 30)  -  2010-03-04 16:12:53.077000000

Info (column width 5000/MAX)  -  Inserted / Log ICode Action on Order Number / 33949591 / with / I-Code / IDEC/ / Memo: / cci questions about alert/

When I tried to use fixed width, the data sits appropriately in its respective columns, except for the Info column data.

What should I do to load the above text file data into the table?  How do I convert it from an unformatted to a formatted text file?

Please help me with your suggestions or with a solution to fix this issue.  Thank you.
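(One possible approach, sketched under assumptions: land each whole line in a single wide varchar column of a staging table, hypothetically dbo.LogStage(Line), then split it with SUBSTRING. The offsets below are illustrative and must be verified against the real file.)

INSERT INTO dbo.[Log] ([Type], [Number], [Customer_ID], [CallDateTime], [Info])
SELECT SUBSTRING(Line, 1, 3),            -- Type
       SUBSTRING(Line, 4, 14),           -- Number
       SUBSTRING(Line, 18, 10),          -- Customer_ID
       SUBSTRING(Line, 29, 30),          -- CallDateTime
       LTRIM(SUBSTRING(Line, 60, 5000))  -- Info: everything after the fixed part
FROM dbo.LogStage
WHERE LEFT(Line, 2) = 'CL';              -- skip HED and TRL rows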

I have raw data in a flat file. I have to import the file into a temporary table, a single column but multiple rows; later I use a stored procedure which actually takes care of the mapping before loading into the final destination table. I was wondering if there is a way I can avoid creating the temp table (to save disk space and DB performance); I have been asked not to touch the database creating temp tables, and the only time I can use the database is when I load the data into the destination table. I was wondering if the Cache Transform in SSIS can take care of this, or if there is any other mechanism in SSIS.

There is a bug in SSIS when exporting data from SQL to a comma-delimited text file where you have specified no text qualifier.

This bug apparently only occurs when you develop the SSIS on an x64 Win7 PC, copy the .dtsx file (Windows Explorer copy/paste) to a network path on an x86 SQL server, and schedule the job to run from SQL Agent on the same x86 SQL server.

When the SSIS runs, the text file is written out containing text qualifier = "_x003C_none_x003E".

If you look at "_x003C_none_x003E", it actually means <none>.  x003C = "<" and x003E = ">".


If you go into the SSIS package, double-click in the connection manager section to open the Flat File Connection Manager Editor, and try to clear the text qualifier by removing the <none>, the <none> value gets added back in.

The only work-around is to NOT open the flat file connection manager editor, but instead make the change using the property window and clear out any value in the TextQualifier field.


Other similar problems occur when you actually want to put a real value in the text qualifier.  For example, if you select the double quote as the text qualifier, copy to an x86 server, and run, you end up with a text file containing the value "_x0022" around each field instead of a double quote.


In my mind this is a serious bug; I did some research, and other people have been having this same issue.


FYI, the SQL server is currently SQL 2008 with SP2.  I will try to get it updated to SP3/4 this weekend to see if that will help.


FYI2: I develop the SSIS using VS2008, and my local test/run is done through Visual Studio on my dev PC (x64).  Everything works fine that way.  I do not have an instance of SQL on my machine, so I did not test running it from SQL Agent on my PC.



I am pulling text files in gzip format from a UNIX system. I want to unzip these files and then import the data from them into a database using SSIS.


Before I start, I'm using SQL 2008.

I have an Excel file with email addresses that need to act as input parameters to a Lookup transformation. I have set the Excel Source to my file and specified the email field to be the output. I have dropped the Lookup transformation onto the Data Flow and connected the two. I'm going to execute a very simple stored procedure, and under the Connections section my SQL query looks as follows: EXEC Test_GetUserName ?
When I run that I get an error saying that no parameter was provided. But when I run EXEC Test_GetUserName 'someemail@companyname.com' everything executes great, for the obvious reason that the email is hard-coded. How do I pass the Excel input as the parameter?

Thanks for all the help.


Hello All! Not sure what is going on here, but basically I am updating a table in my database via a web service. The table, or business object, contains about three datetime fields. On the database, the data type for the date and time fields is datetime. In my app, or in my business object, the data type for those fields is DateTime. Now, why or how did one end get converted into a datetime2?

Does Silverlight use datetime2 by default or something?

For more info on datetime2 http://technet.microsoft.com/en-us/library/bb677335.aspx



I wish to use an SSIS XML Source data flow source (tried both 2005 and 2008) to import an XML file and its XSD schema. When I do, I get the following error when I click on the Columns property:

"Error at Data Flow Task [XML Source [1]]: The XML Source Adapter was unable to process the XML data.
The content of an element can not be declared as anyType

The snippet of the XSD that contains this definition is the following:

<xs:element name="custom" type="xs:anyType" minOccurs="0">
  <xs:annotation>
    <xs:documentation>Extensibility point that can contain arbitary content. Expected to contain additional elements and attributes for data not captured within this schema</xs:documentation>
  </xs:annotation>
</xs:element>

The XSD is not owned by my department and having to change it would take some bureaucracy.

Are there any suggestions to get around this problem without having to modify that schema in anyway?


Thank you very much in advance.



I am importing an XML file into a SQL Server DB through SSIS. In this process, in the data flow, I am validating the XML against the XSD, using an XML Source. It works fine and redirects the error output to a flat file destination when the column data type in the XSD is a string. But for integer data type columns, when there are validation errors, the redirect row option does not work; the package fails at the validation step itself. When I check the execution results, the validation errors are displayed there.

Please help me validate the integer data type columns and redirect the rows in case of validation errors.




I am getting the following error in SQL Server 2008 R2 (build 10.50.1734) x86 BI Development Studio (BIDS) 9.0.30729.4462 with .NET Framework 3.5 SP1, in an Integration Services project, in a Data Source View table query defined under the Generic Query Builder, when I attempt to run it:

An error occurred while reading data from the query result set. Unable to cast COM object of type 'System.__ComObject' to interface type 'IRowset'. This operation failed because the QueryInterface call on the COM component for the interface with IID '{0C733A7C-2A1C-11CE-ADE5-00AA0044773D}' failed due to the following error: No such interface supported (Exception from HRESULT: 0x80004002 (E_NOINTERFACE)).

This happens on Windows 7 as well as Windows XP x86 workstations. The Data Source View is bound to a Data Source connected to a SQL Server 2005 SP3 (build 9.0.4053) server. I do not get this error with the VDT Query Builder. Does anyone have any ideas?




Hi All! I have an issue I've been trying to fix but can't seem to figure it out. I was hoping a kind person would point me in the right direction. :o)

I have an SSIS package that uses an Excel connection manager source, and I want to run this package through a job scheduled in SQL Server Agent. The data types for the Excel file fields are 2 (DT_WSTR) and 5 (DT_R8).

When I run the package directly (VS solution), all of the data fields are properly imported into the database table. But when I run this package through the SQL Server Agent job, ONLY the string (DT_WSTR) fields in each row are imported; all of the float fields are imported as NULL.

I set the data type for these float fields to "float" in the SQL Server import table. Even though the Excel source float fields indicate a type of DT_R8 in the Excel connection manager and I set the data types in the SQL Server table to "float", I also used the Data Conversion component and set the type to "float" as a fail-safe.

I should add that the data access mode in the Excel connection manager uses a SQL command, to select only those columns that I need and to trim rows that I don't need. Here's the query I have in the Excel Source Editor:

select f1, f2, f3, f5, f6, f7, f8
from [mdo$]
where f2 <> 'Rep Name' and f2 is not null and f2 <> 'Sum:'

Any help would be greatly appreciated!


I want to create a multidimensional array dynamically in a Script Component, assign it to an Object data type variable, and then use that variable in another Script Component to retrieve the elements of the multidimensional array.


I am using SSIS 2008. I set a variable @t1 to data type Char and set its value to Y. Next I used an Execute SQL Task with an ADO.NET connection to a SQL 2008 R2 database to insert the value of @t1 into a test table with only one column, of data type NCHAR(1):

insert into test1 values (@t1)

In the Parameter Mapping section of this task I set the data type to String and initially set the parameter size to -1.

The only problem is that it tries to insert a 2-character numeric value into the SQL table. If I set the value of @t1 to N, the parameter mapping seems to convert this to 78; when I set the value of @t1 to Y, the parameter mapping converts it to 89. It looks like it converts the character to its ASCII code.

Is this the intended behaviour? All I wanted to do was insert the actual character into the destination column, not the ASCII code.



Hi, I have a "gender" column in my database in which 0/1 values exist. I want to change 0 to male and 1 to female in SSIS.
2) I also have a DOB table in which dates are in dd-mmm-yy format. I want to check whether some of them are erroneous, like 31-feb-90, 22-apl-88, 21/03/67.
How do I remove or convert this erroneous data, and how do I send the dirty data to an "error table"?
I need a quick response, please.
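(A hedged sketch of one way to do both in T-SQL after staging the rows; table and column names are hypothetical, and in a Data Flow the same logic maps to a Derived Column plus a Conditional Split. Note that ISDATE depends on the session's DATEFORMAT and language settings.)

-- Route rows with an unparseable DOB to the error table
INSERT INTO dbo.ErrorTable (gender, dob)
SELECT gender, dob
FROM dbo.Staging
WHERE ISDATE(dob) = 0;

-- Convert the clean rows: decode gender and cast the date
INSERT INTO dbo.FinalTable (gender, dob)
SELECT CASE gender WHEN '0' THEN 'male' WHEN '1' THEN 'female' END,
       CONVERT(datetime, dob)
FROM dbo.Staging
WHERE ISDATE(dob) = 1;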

Hello there

I have a problem with SSIS string-to-date conversion. I referred to a lot of similar questions in forums, but somehow it is not working for me.

My input data is in a SQL Server database column of type varchar(10).

The date value is stored in the staging table in the following format: 08/06/2006 (mm/dd/yyyy).

My destination is a datetime column.

I created a derived column and applied this expression:

(DT_DATE)(ISNULL(AuditedDate) ? NULL(DT_DATE) : (DT_DATE)(SUBSTRING(AuditedDate,7,4) + "//" + SUBSTRING(AuditedDate,1,2) + "//" + SUBSTRING(AuditedDate,3,2)))

I keep getting this error.

Error: 0xC0049064 at Data Flow Task, Derived Column [574]: An error occurred while attempting to perform a type cast.

Can anyone please help.
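(A guess at the cause, for anyone hitting the same error: with mm/dd/yyyy input, SUBSTRING(AuditedDate,3,2) starts on the first '/' rather than on the day, and "//" is not a valid date separator, so the cast fails. A corrected expression might look like this; an untested sketch:)

ISNULL(AuditedDate) ? NULL(DT_DATE) : (DT_DATE)(SUBSTRING(AuditedDate,7,4) + "-" + SUBSTRING(AuditedDate,1,2) + "-" + SUBSTRING(AuditedDate,4,2))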





Dear Techies

I have an SSIS package which writes several text files onto a shared drive. Some of these text files are transformations from tables, some from complex views, and a few are generated by Script Tasks executing stored procedures. Recently I have faced an issue with this package in our Development region; the issue does not occur in any other region. The issue is explained below.

The log file shows this

DataFlow: 2010-07-01 10:37:49.98
   Component "InventoryReport" (2082) will receive 9979 rows on input "Flat File Destination Input" (2083)
End DataFlow

But the report file generated was 0 KB. This is occurring for a few other reports in the same package. When I run the SSIS package from Visual Studio I do not face this issue; it occurs when the package is executed using the DTEXEC utility. Our system is set up so that a command file (.cmd) calls DTEXEC and kicks off the package. The job completes successfully, but some of the reports are incomplete. Does anyone have any thoughts on this issue? Please help.


Thank You


Jayaraj Sivadasan



Hi there,

Here's the deal: I'm trying to select an entire table from a SQL task (a straight select) and then dump that result set to a text file on the fly.

Has anybody done that in the past?
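(One common approach, sketched with hypothetical server, database, and path names: run bcp in queryout mode, either from an Execute Process Task or via xp_cmdshell. -c writes character data, tab-delimited by default; -T uses Windows authentication.)

bcp "SELECT * FROM MyDb.dbo.MyTable" queryout "C:\exports\MyTable.txt" -c -S MyServer -T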









To convert packed fields, Microsoft released the Unpack Decimal transformation, but it supports only SSIS 2005. I don't see a version that supports SSIS 2008.

Please let me know how to handle these packed fields using SSIS 2008.




