
How to capture bad records from a "Flat File Source" in SSIS

The flat file has 10 columns, so I used 9 delimiters. A few records in the flat file have more than 10 columns. I need to redirect those into an error table, and the remaining valid records need to be loaded into the target. Please help.


4 Answers Found


Answer 1

This is a difficult scenario. You may be able to do it by dragging the red (error) arrow from the Source Adapter in your data flow to a task that will handle the errors. You must also set the error output configuration to "Redirect row".

However, the hard part is that this MAY not work. You may need to go with a Script Transform configured as a Source. In there, you would read every row via the script and decide what to do with it. The advantage is that it gives you much finer control.


Answer 2

This could go a few ways depending on the flat file itself and its format. I agree that a script source may be the best choice: bring the rows in and either RegEx them or validate as you go. Note that most processing for these types of things will be slow (row-by-row processing).

Example: if the extra column in the data rows is simply another occurrence of a comma, you could bring in the flat file source as normal. Using a script transformation, you could then process the input row with a RegEx match on the last column. Usually the last column will contain the added fields, because the source goes off the metadata that is generated and only sees the number of header columns it knows of; anything after that will be placed into the last column. So in that last column you could do a .Match and count the instances of commas. If you find extras, you can act upon that, sending the row to an output buffer so it can be later processed or reviewed.
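The comma-counting check described here can be sketched like this (Python for illustration; inside SSIS the Script Transform would be VB or C#, and the function name is made up):

```python
import re

def extra_delimiter_count(last_column: str, delimiter: str = ",") -> int:
    # Count delimiter characters that spilled into the last column.
    # Anything above zero means the row carried more fields than the
    # connection manager's metadata expects, so it can be redirected.
    return len(re.findall(re.escape(delimiter), last_column))
```

A row whose last column yields a count greater than zero would be sent to the error output buffer; everything else flows through unchanged.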

How you would do this really varies with the flat file format. Pre-processing the file before it gets into the data flow is also common if the flat file is truly unstable or you cannot trust the contents, without stopping the package from doing its job.


Answer 3

Thanks for your quick response. It will be tough for me to write the script as I am new to SSIS.

Can you share your thoughts on this: what if I use a Conditional Split to check for extra delimiters in each record (i.e., a bad record) and redirect it to an ERROR table?

Many Thanks,



Answer 4

Well, here's the thing: your header rows probably have VERY different metadata than your detail rows, so you would need to have just ONE column in the flat file connection manager; then, AFTER you split the data into Header/Detail, you would parse out all the columns using conditional splits.

Why not jump in with both feet and learn how to use the Script task? We'll help (there's plenty of it in this forum).

Try this:

1. Add a File Connection Manager to your package (NOT a Flat File connection manager, just a regular File one).
2. Add a Data Flow, and in it add a Script Transform, set to Source.
3. Open it, and on the Inputs and Outputs tab add three outputs, one each for the Header, Detail, and Footer paths, and add all the columns associated with each.
4. On the Connection Managers tab, add the File Connection Manager from step 1.
5. On the Script tab, click Edit Script.
6. In the Public Overrides Sub CreateNewOutputRows() method, take a look at the following code snippets for examples of what to do:


    Dim cnInvoiceFile As IDTSConnectionManager100 = Me.Connections.InvoiceFile
    Dim fn As String = CType(cnInvoiceFile.AcquireConnection(Nothing), String)
    Dim readS As New StreamReader(fn)   ' requires Imports System.IO at the top of the script
    Dim dataLine As String

    Try
        If readS IsNot Nothing Then
            While Not readS.EndOfStream
                dataLine = readS.ReadLine()
                ' Decide whether dataLine is a Header, Detail, or Footer row,
                ' then populate the matching output buffer, for example:
                With OutputHeaderBuffer
                    .AddRow()
                    .MyField1 = <something>
                    .MyField2 = <something else>
                End With
            End While
        End If
    Catch ex As Exception
        ' Handle or log the error, then rethrow so the component fails visibly
        Throw
    Finally
        readS.Close()
    End Try

You will need to put in your own logic to test if the row is a Header, Detail, or Footer and write to the appropriate OutputBuffer accordingly.



Hello, I'm new to dealing with SSIS. When running my package in BIDS, the package runs successfully, but when I tried to run it from a SQL Server Agent job step, the SSIS package failed with the following error:

" Executed as user: xxx. Microsoft (R) SQL Server Execute Package Utility  Version 10.50.1600.1 for 64-bit  Copyright (C) Microsoft Corporation 2010. All rights reserved.    Started:  7:38:18 AM  Error: 2010-08-16 07:38:27.98     Code: 0xC0202009     Source: x_Package Connection manager "x_DB"     Description: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80040E4D.  An OLE DB record is available.  Source: "Microsoft SQL Server Native Client 10.0"  Hresult: 0x80040E4D  Description: "Login failed for user 'sa'.".  End Error  Error: 2010-08-16 07:38:27.98     Code: 0xC020801C     Source: Save Data to DB OLE DB Destination [862]     Description: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER.  The AcquireConnection method call to the connection manager "x_DB" failed with error code 0xC0202009.  There may be error messages posted before this with more information on why the AcquireConnection method call failed.  End Error  Error: 2010-08-16 07:38:27.99     Code: 0xC0047017     Source: Save Data to DB SSIS.Pipeline     Description: component "OLE DB Destination" (862) failed validation and returned error code 0xC020801C.  End Error  Error: 2010-08-16 07:38:27.99     Code: 0xC004700C     Source: Save Data to DB SSIS.Pipeline     Description: One or more component failed validation.  End Error  Error: 2010-08-16 07:38:28.00     Code: 0xC0024107     Source: Save Data to DB      Description: There were errors during task validation.  End Error  DTExec: The package execution returned DTSER_FAILURE (1).  Started:  7:38:18 AM  Finished: 7:38:28 AM  Elapsed:  9.61 seconds.  Process Exit Code 1.  The step failed."

I'm using SQL 2008 and VS 2008 to design my packages on a Windows Server 2008 SP2 (Enterprise Edition) machine.

Any help would be appreciated... thanks.




I am using flat file connection managers in my packages, with variables to update the connection strings in the connection managers at run time. These variables are calculated at run time using package configurations and a Script Task, and are mapped to the connection strings using expressions. Currently the packages have some static values assigned to the variables. The DelayValidation property in the package is set to True. When the packages are run, execution completes and all the tasks in the package are executed, but the package logs an error message:


The file name "\\tbsdc1-s-2345\" specified in the connection was not valid.

Because of this, the max error count of the package is reached and the package fails. This happens in an environment where the account executing the job does not have access to the folder specified in the static values of the variable. I have executed a couple of scenarios:

    1.    A static network path value is present in the variable ("\\tbsdc1-s-2345\") which is used to populate the connection string. The package reports failure with the above issue. This happens for both flat file sources and Bulk Insert tasks. The error message is logged, but during execution the value in the variable is correct, as the data is extracted from the correct file, and all tasks in the package are executed successfully.

    2.    No value is present in the variable which is used to populate the connection string. The packages which use a flat file source to extract data are successful and do not report the error message. Packages with Bulk Insert tasks fail with error messages; the package fails at the Bulk Insert task and the following tasks are not executed:

System exception: The specified connection "FF_Connection" is either not valid, or points to an invalid object. To continue, specify a valid connection.

    3.    A static string value is present in the variable (e.g. ABCD) which is used to populate the connection string. The packages with flat file connections or bulk inserts both succeed, and no error message is logged.


In scenarios 1 and 3 above, all tasks are executed successfully. We have executed these scenarios from the designer in the dev environment after making sure that the account being used to execute the packages did not have access to the server specified as the static value in the variable.

If the account is given access to the server that is present in the static value of the variable, the package executes successfully without any error messages.


Is there any other property in the package which can have an impact and cause connection managers to be validated even though the DelayValidation property is set to True and the ProtectionLevel is set to "DontSaveSensitive"? Currently I have worked around the issue using scenario 3, but I am still not sure of the root cause. Can somebody throw some light on this?



I am extracting data from flat files. The data is validated using a few business rules, and all invalid rows are diverted to an Invalid file. The path of the file is configurable (package configurations). The path is retrieved in the master package using SQL Server configurations; the values are then passed on to the child package as parent package variables.

There are numerous child packages which are called by a common master package. Most times the execution of the child packages is successful. On a few occasions the package fails with the error "System exception: component "FF_DST_Invalid" (286) failed the pre-execute phase and returned error code 0xC0202010.".

This message is generated when no file name is provided in the connection string, but when the package is re-executed without any changes it succeeds without any errors. This has happened for two packages (PackA and PackB).

I checked the packages and found that the name of the variable mapped into each connection manager's connection string property is the same (VarA). The IDs of the variables are different in the two packages. There are scenarios where the packages will run in parallel on the same SSIS box. Can there be an issue when two packages have the same variable name and are executing in parallel in the same SSIS environment?



Hi All,

I have a package which loads data from a flat file source to an OLE DB destination; it executes successfully and the data is loaded perfectly.

But later, when the data in the flat file source is modified, i.e. if an extra column gets added to the text file, the package throws an error because it is unable to handle the extra added column.

How can I avoid this error? I need my package to execute successfully, ignoring the changes (added columns) in the flat file source.
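One way to make a load tolerant of columns being added later is to read each line as a single column and keep only the fields you expect. A rough sketch of that idea (Python for illustration; the field count and delimiter are assumptions):

```python
def first_n_fields(line: str, n: int, delimiter: str = ",") -> list:
    # Split the raw line and ignore any extra trailing columns, so a
    # file that grows a new column still yields the expected n fields.
    return line.split(delimiter)[:n]
```

In SSIS terms, that means a one-column flat file connection plus a Script or Derived Column transform doing the split, rather than letting the connection manager's fixed metadata fail.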

Please provide me with your suggestions and solutions.

Thanks in Advance!!


I have a text file with multiple record lengths that I need to load into a table. The file has multiple record types, and each record type determines the length of the record. Let me try to explain: the file has a balance line 1, multiple detail records, a balance line 2, and a trailer record.

The first record is a balance line 1, which has a record length of 144.

Detail records with a record type of INV or CRD will have a record length of 147. The remaining record types will have a length of 144.

The second-to-last record is balance line 2, which has a record length of 156.

And the trailer record, which has a record length of 162.

Here is a portion of my source file.  I have made each line bold where you can see the rec type.  The record starting with 9999 is the trailer.

077700001BL100000002010100120101008BALANCE FORWARD 0000000000000000001081175D0000000000{0000000000{0000000000{000001081175D00 0000000000{

077701000INV01953172010100120101008GROCERY - DRY 0000000000000000000073504I0000016678K0000000852G0000000000{000000057679D011N 0000093201E010

07779008099000000002010100820101008KC CASH REC. 0000000000000000001081175M0000000000{0000000000{0000000000{000001081175M00 0000000000{

077790090BL200000002010100820101008PAY THIS AMOUNT 0000000000000000001376839B0000000000{0000000000{0000000000{000001376839B00 0000000000{

999900170 00000000000000000000000 0000000000000000000000000{0000000000{0000000000{0000000000{000000000000{00 0000000000{ 20101008113043

Can someone please tell me how I can load this file into a table so we can report from it? I tried doing just a regular fixed-width read of the source text file and load, but with the different record lengths it causes the longer records to wrap into the shorter records.

We will be starting to get files regularly next week, and I have about a week to get something thrown together. I'm fairly new to SSIS; any help would be greatly appreciated.
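In the sample above, the three-character record type appears to sit at offset 9 of each line (BL1, INV, BL2, ...), and the trailer starts with 9999. A Script Source could route each line by that code before parsing it with the length for that type. The dispatch, sketched in Python (the offsets are guesses from the sample data, so verify them against the file spec):

```python
def record_type(line: str) -> str:
    # Trailer rows are identified by their 9999 prefix; everything else
    # carries a 3-character type code starting at offset 9.
    if line.startswith("9999"):
        return "TRAILER"
    return line[9:12]

# Each type maps to its fixed record length, per the description above;
# types not listed here default to the 144-character layout.
RECORD_LENGTHS = {"BL1": 144, "INV": 147, "CRD": 147, "BL2": 156, "TRAILER": 162}
```

With the type known, each line can be sent to the output (and column layout) that matches its length, instead of forcing one fixed width onto every row.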


I have a flat file that has four mandatory records and one optional record. I've made a schema that will validate if all five rows are present, but not if the optional record is omitted.

After reading some other answers here, I've tried setting Parser Optimization to Complexity and also played around with different values of Group Order Type on different levels of the schema, but I can't get it to validate.

The error I get is "error BEC2004: Unexpected end of stream while looking for:
The current definition being parsed is Root. The stream offset where the error occured is 193. The line number where the error occured is 5. The column where the error occured is 0."

Example four-record file:

17.09.2007    20:38:58,10    XXX        PRODUCT     1700    18.09.07
17.09.2007    20:38:59,20    XXX        BRUTTO      3380
17.09.2007    20:39:00,20    XXX        NETTO       850
17.09.2007    20:39:01,20    XXX        SPEED       25000

Example with optional fifth record:

17.09.2007    20:49:49,10    XXX        PRODUCT     1700    18.09.07
17.09.2007    20:49:50,20    XXX        BRUTTO      7680
17.09.2007    20:49:51,20    XXX        NETTO       4840
17.09.2007    20:49:52,20    XXX        SPEED       0
17.09.2007    20:49:53,10    XXX        STATUS    COMPLETED

(I tried posting my schema but ran into a 60000 character post limit :) )


Hi All,

My source and destinations are SQL Server. I need to create nearly 40 packages with different tables. Sometimes we will get bad records from the source. How do I capture the bad records? If I use the redirect option in the OLE DB destination, I need to create one more error table for every table. Is that good practice or not? How should I handle this situation?






Our SSIS 2008 developers recently moved from Windows XP 32-bit PCs to Windows 7 64-bit PCs. Our test and production execution environments are Windows Server 2008 32-bit servers. We have existing jobs that write rejected rows from FACT loads to flat files; these flat files are then emailed to business users so they can clean up data in the source systems. We have modified a few of these jobs since the developers moved to Windows 7 64-bit for development. When we run the jobs from BIDS on the Windows 7 64-bit machines, the flat files look as we would expect. When we migrate them to test or production and run them, the flat file is in the format below. I have read some articles stating that the <none> in the Text Qualifier property of the flat file definition is causing the problem, but I don't think that is true. I have removed the <none> and set the text qualifier to ' or ", and there is still garbage in the output file. If I use a single quote as the text qualifier, the string _x0027_EmployeeFirstName_x0027_ is put in the flat file.

Thanks in advance for any suggestions.

Run from BIDS on Windows 7 64 bit

EmployeeFirstName,EmployeeLastName,ActivityDate,FacilityID,Facility Name

Run on Windows 2008 32 bit

_x003C_none_x003E_EmployeeFirstName_x003C_none_x003E_,_x003C_none_x003E_EmployeeLastName_x003C_none_x003E_,_x003C_none_x003E_ActivityDate_x003C_none_x003E_,_x003C_none_x003E_FacilityID_x003C_none_x003E_,_x003C_none_x003E_Facility Name_x003C_none_x003E_


I'm trying to copy a file, in an SSIS package, to a destination that requires a username and password in order to connect. I am doing this in a VB Script Task and have tried this:


Dim ServerShare, UserName, Password As String
Dim NetworkObject, FSO As Object
ServerShare = "\\someserver\sharename$"
UserName = "AD\Username"
Password = "itspassword"
NetworkObject = CreateObject("WScript.Network")
FSO = CreateObject("Scripting.FileSystemObject")
NetworkObject.MapNetworkDrive("", ServerShare, False, UserName, Password)

The last line is not recognized...
Is there another way, or am I missing a reference at the top? I can't seem to find anything suitable to add...
Currently I have this at the top:

Imports System.IO
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime

Help needed immediately... (of course!!!)





Hi all, I need some help with SSIS.

I have a source file with column1. I want to split the column1 data into multiple columns wherever there is a semicolon (';'), and there is no specific length between semicolons. Let's say...


And at the destination we have 4 columns, say D1, D2, D3, D4.
I want to map
John -> D1
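The split itself is straightforward once the row is in a Script Transform; the core step, sketched in Python (the column count of 4 and names D1..D4 are taken from the question):

```python
def split_semicolon_fields(column1: str, n: int = 4) -> list:
    # Split on ';' and pad with None so there are always n destination
    # values (D1..D4), even when a row has fewer pieces.
    parts = column1.split(";")
    return (parts + [None] * n)[:n]
```

Each element of the returned list would then be mapped to one destination column.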

Please I need it ASAP

Thanks in Advance,



Is there a mechanism in the SSIS flat file connection manager where I can pass the type of column delimiter as a variable? I know the standard ones are comma, tab, vertical bar, etc., but rather than selecting them manually, can I pass one as a variable to the package?

Hi Guys,

I have a flat file that contains data that needs to be inserted into a SQL table. The problem lies in the flat file itself; I need to find a way to format it into the desired format using SSIS.

Here is the sample data that I have in my flat file.


I want to format it like this.


20071016, 444,  BOXING , GLOVE , ITEM

Does anybody know how to do this using SSIS?



Hi, I am using SSIS to generate a flat file from an OLE DB source, and I created a job which runs weekly.

This is working; now I want to set something up so that every week my file is created with a different name, like ABCTimestamp.

Is there a way to do this in the job or in SSIS, so that it generates a file every week named filenamewithtimestamp.txt?
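The usual SSIS approach is an expression on the Flat File connection manager's ConnectionString property that appends the run date to the name. The name-building step itself looks like this (sketched in Python; the prefix and the yyyymmdd format are assumptions):

```python
from datetime import datetime

def weekly_file_name(prefix: str = "ABC", ext: str = ".txt") -> str:
    # Append a yyyymmdd stamp so each weekly run writes a new file,
    # e.g. ABC20240101.txt instead of overwriting last week's output.
    return f"{prefix}{datetime.now():%Y%m%d}{ext}"
```

Because the expression is evaluated each time the package runs, the scheduled job needs no changes for the name to roll over.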


Thanks and Regards



For tracking purposes, I need to insert the name of the flat file from which data is imported into a column called sourceFile. How can I grab the connection string and get it inserted into a column in the database? (Use the Derived Column transformation?)
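Yes, a Derived Column works: put the file name into a package variable (or read it from the connection), then add it as a new column in the data flow. Conceptually the transform does this (a Python sketch; the names are illustrative):

```python
def tag_with_source(rows: list, source_file: str) -> list:
    # Append the originating file name to every row, which is the same
    # effect a Derived Column gives when mapped to the destination's
    # sourceFile column.
    return [row + [source_file] for row in rows]
```

Mapping the derived column to sourceFile in the destination then lands the value in the table alongside the imported data.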






Here is my problem: I work on an SSIS package with SQL Server 2005. I need to extract data from a table and put the data into CSV files.

But... the flat file names should be dynamic and assigned by a variable...

Here's an example of my table :

Column header :
Id, Name, Number
1   TOM     22
2   TOTO   44
3   SAM     44
4   RADIO   55

I expect to have 3 csv files :

for example : USER_44.csv contains :

If there are 50 different numbers, I expect 50 files.

Can I do that in a data flow?

Thanks for answering.
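A data flow alone can't fan out to a variable number of files, but a ForEach loop over the distinct Number values, with the file name built from a variable, can. The grouping that drives it, sketched in Python (the file-name pattern is taken from the USER_44.csv example):

```python
from collections import defaultdict

def group_rows_by_number(rows):
    # rows are (Id, Name, Number) tuples; each distinct Number becomes
    # one output file such as USER_44.csv.
    groups = defaultdict(list)
    for rid, name, number in rows:
        groups[f"USER_{number}.csv"].append((rid, name))
    return dict(groups)
```

In the package, each loop iteration would filter the source query on one Number and write one file, so 50 distinct numbers yield 50 files without any package changes.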

I have a flat text file (comma delimited) that is essentially multiple files, each with its own format, combined into one file. The file comes from an external software vendor, so unfortunately we don't have much choice but to work with what we receive.

Here is an example of what the file could look like:

Customer Data

Customer Plan

Customer Payment

There is an empty line between each 'section' of data. I adapted a VB script I found online that takes the incoming file and saves off each section as its own file so that each one can be imported separately, but this seems inefficient. I'm really new to SSIS in general, but it seems like it shouldn't be that difficult to take the data, split it where there is an empty line, and then import each section into the appropriate SQL table.
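The blank-line split the script performs can stay quite small; its core, sketched in Python (how each resulting section is routed to its table is still up to the package design):

```python
def split_on_blank_lines(lines):
    # Break the combined file into sections wherever an empty line
    # appears; each section can then be imported on its own.
    sections, current = [], []
    for line in lines:
        if line.strip():
            current.append(line)
        elif current:
            sections.append(current)
            current = []
    if current:
        sections.append(current)
    return sections
```

The first line of each section (e.g. "Customer Data") can then be used to decide which destination table that section belongs to.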

Any ideas would be most welcome. Thanks!



I am having a random error with my package (trying to generate a fixed-length text file using the Ragged Right format from one SQL Server table).

The data looks like this for most runs (good data):


However, for maybe 1 out of 10 to 20 runs, some rows appear to be deleted mysteriously (the log shows everything fine, and I am sure there is no other process updating the file). This can happen anywhere, but only in one place per run. SSIS doesn't provide any error or warning; it even says it inserted the correct number of rows in the log.

Sometimes data generated like this:


DTLA                                                                      000033.5200C

Has anyone seen this kind of error before? I have tried everything, e.g. concatenating all columns into one string, but still couldn't fix it.

Any information will be appreciated!







I have 10 text files which are on average 6 GB in size. Of those 10, 4 files average 10 GB in size.

We have 10 SSIS packages which load those text files into 2 staging tables; each set of 5 packages loads 1 table.

Inside each package there is only a flat file source, a data conversion for all columns from the source, and a write into the destination table.

This process takes around 5 hours to complete. The sizes of the 2 tables after the load completes are 60 M and 40 M each.


The other thing I have observed is that all the packages are scheduled through one schedule, and the schedule has 10 different SQL Server jobs (for the 10 packages). Does that have an impact on processing time, since all these jobs will be triggered at the same time and execute the 10 packages? Does that create locks on the table when the 1st package tries to insert, causing the whole process to take 5 hours?


Any suggestions will be really helpful here.


I created an SSIS package to export to a flat file (from a SQL command: a stored proc).

I don't want my SSIS package to create an empty file if there is no data.

How can I achieve this?
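A common pattern is a Row Count transform in the data flow feeding a variable, with a File System Task (delete) guarded by a precedence constraint on that count. The post-run step amounts to this (sketched in Python; the path handling is illustrative):

```python
import os

def remove_file_if_no_rows(path: str, rows_written: int) -> bool:
    # Delete the exported file when the data flow wrote zero rows,
    # returning True when a removal actually happened.
    if rows_written == 0 and os.path.exists(path):
        os.remove(path)
        return True
    return False
```

Letting the export run and then deleting the empty result is usually simpler than trying to stop the flat file destination from creating the file in the first place.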





I have a corrupt flat file, and I need to write an SSIS Script Component to replace LF with CR LF in the flat file.

Any idea how to do it?
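The replacement needs to avoid doubling line endings that are already CR LF; the core of it, sketched in Python (in the actual Script Component this would be VB or C# over the file's text):

```python
import re

def lf_to_crlf(text: str) -> str:
    # Replace only bare LFs (those not already preceded by a CR) so
    # existing CR LF pairs are left untouched.
    return re.sub(r"(?<!\r)\n", "\r\n", text)
```

The negative lookbehind is what keeps the function idempotent: running it twice over the same text changes nothing the second time.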



