
BULK INSERT row terminator problem

Hi all.

I'm using BULK INSERT to load data from a txt file and have this problem with the row terminator: I set '\n' as the row terminator and ',' as the field terminator. Now look at these 2 inputs for the txt file:

A,B,C,D

E,F,G,H

This goes smoothly and the data is loaded into the DB.

A,B,CD

E,F,G,H

Now here I have a problem - E and F are inserted as part of the first row. I would expect the last column in the 1st row to be NULL and the 2nd row to be E-F-G-H.
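Here is a minimal version of what I'm running (the column definitions and file path are placeholders):

-- Placeholder 4-column staging table
CREATE TABLE MobileDevicesStage (Col1 VARCHAR(50), Col2 VARCHAR(50),
    Col3 VARCHAR(50), Col4 VARCHAR(50))

BULK INSERT MobileDevicesStage
FROM 'C:\data\input.txt'      -- placeholder path
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')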

Does anyone have any idea?


8 Answers Found

 

Answer 1

Can you post your BULK INSERT statement?  Do you specify KEEPNULLS in it?
 

Answer 2

Here it is:

 

DECLARE @CurrentTime DATETIME
DECLARE @CurrentTimeStr VARCHAR(50)
DECLARE @InputTxtFileLocation VARCHAR(100)
DECLARE @ErrorTxtFileLocation VARCHAR(100)
DECLARE @FieldDelimiter CHAR(1)
DECLARE @MaxErrors VARCHAR(10)

-- Timestamp string used to make the error file name unique
-- (DATEADD of 0 months is effectively just GETDATE())
SET @CurrentTime = DATEADD(mm, 0, GETDATE())
SET @CurrentTimeStr = dbo.toDateID(CONVERT(VARCHAR(50), @CurrentTime, 100))

-- File locations come from a configuration table
SELECT @InputTxtFileLocation = PropertyValue FROM SystemGlobalVariables WHERE PropertyName = 'MobileDevicesInputFileLocation'
SELECT @ErrorTxtFileLocation = PropertyValue FROM SystemGlobalVariables WHERE PropertyName = 'MobileDevicesErrorFileLocation'

-- Note: the field delimiter here is CHAR(127), not the ',' from the question
SET @FieldDelimiter = CHAR(127)
SET @MaxErrors = '0'

SET @ErrorTxtFileLocation = REPLACE(@ErrorTxtFileLocation, '.txt', '_' + @CurrentTimeStr + '.txt')

-- Compose and execute the BULK INSERT dynamically
DECLARE @BulkStr NVARCHAR(MAX)
SET @BulkStr = 'BULK INSERT MobileDevicesStage FROM ''' + @InputTxtFileLocation + ''' WITH
(FIELDTERMINATOR = ''' + @FieldDelimiter + ''', ROWTERMINATOR = ''\n'',
MAXERRORS = ' + @MaxErrors + ', ERRORFILE = ''' + @ErrorTxtFileLocation + ''')'

EXEC (@BulkStr)

 

 

Answer 3

Add KEEPNULLS and see if it retains the column values accurately:

FIELDTERMINATOR = '''+@FieldDelimiter+''', ROWTERMINATOR = ''\n\r'', KEEPNULLS,

I also like to use newline and carriage return, \n\r (as I've added above).
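For reference, a sketch of the same options in a static (non-dynamic) statement, keeping the original '\n' terminator; the file path is a placeholder:

-- Sketch only: KEEPNULLS loads empty fields as NULL rather than
-- the column's default value
BULK INSERT MobileDevicesStage
FROM 'C:\data\input.txt'      -- placeholder path
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', KEEPNULLS, MAXERRORS = 0)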

 

Answer 4

Didn't help, unfortunately...

 

 

Answer 5

And if I use ''\n\r'' I get this error:

Msg 4866, Level 16, State 1, Line 1
The bulk load failed. The column is too long in the data file for row 1, column 4. Verify that the field terminator and row terminator are specified correctly.

Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.

Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".

 

Answer 6

Just noticed the missing comma when trying your exact file format. Either you'll need to process this beforehand (SSIS would be a good option) or possibly work with a format file.

I'll see if I have the same situation with a format file and post it up. 
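For reference, a minimal sketch of a non-XML format file for a 4-column table (the column names, lengths, and collation here are assumptions; the first line is the bcp version, e.g. 9.0 for SQL 2005):

9.0
4
1   SQLCHAR   0   50   ","      1   Col1   SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR   0   50   ","      2   Col2   SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   50   ","      3   Col3   SQL_Latin1_General_CP1_CI_AS
4   SQLCHAR   0   50   "\r\n"   4   Col4   SQL_Latin1_General_CP1_CI_AS

It would be pointed to with something like:

BULK INSERT MobileDevicesStage
FROM 'C:\data\input.txt'      -- placeholder path
WITH (FORMATFILE = 'C:\data\mobiledevices.fmt')

Note that a format file is still consumed field by field, so on its own it may not solve the missing-delimiter case.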

 

Answer 7

Any news?
 

Answer 8

(yaniv_yosef) writes:

> I'm using BULK INSERT to load data from a txt file and have this problem
> with the row terminator: I set '\n' as the row terminator and ',' as the
> field terminator. Now look at these 2 inputs for the txt file:
>
> A,B,C,D
> E,F,G,H
>
> This goes smoothly and the data is loaded into the DB.
>
> A,B,CD
> E,F,G,H
>
> Now here I have a problem - E and F are inserted as part of the first
> row. I would expect the last column in the 1st row to be NULL and the
> 2nd row to be E-F-G-H.

Indeed you have a problem. BCP and BULK INSERT read a binary stream and
consume one field at a time. They have no notion of "lines"; \r\n for
them is just another field terminator. So with the sample data above you
get one record with the values A - B - CD\r\nE - F,G,H.
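To make that concrete, here is a trace of how the parser consumes the second sample (assuming a 4-column table, ',' as the terminator for the first three fields and '\r\n' for the last):

field 1: read until ','    -> "A"
field 2: read until ','    -> "B"
field 3: read until ','    -> "CD\r\nE"   (the line break is just data here)
field 4: read until '\r\n' -> "F,G,H"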

If you want something intelligent, you will have to write it on your
own.
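For example, a minimal sketch of such pre-validation in T-SQL (the staging table and file path are placeholders): load each raw line whole into a one-column table, then check the delimiter count before splitting.

-- Placeholder staging table; one column holds a whole raw line
CREATE TABLE RawLines (Line VARCHAR(4000))

-- With a single target column, only the row terminator matters,
-- so every line arrives intact
BULK INSERT RawLines
FROM 'C:\data\input.txt'      -- placeholder path
WITH (ROWTERMINATOR = '\n')

-- Flag lines that do not contain exactly 3 field delimiters,
-- e.g. 'A,B,CD' with its missing comma
SELECT Line
FROM RawLines
WHERE LEN(Line) - LEN(REPLACE(Line, ',', '')) <> 3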


-- Erland Sommarskog, SQL Server MVP, esquel@sommarskog.se

Links for SQL Server Books Online:
SQL 2008: http://msdn.microsoft.com/en-us/sqlserver/cc514207.aspx
SQL 2005: http://msdn.microsoft.com/en-us/sqlserver/bb895970.aspx
SQL 2000: http://www.microsoft.com/sql/prodinfo/previousversions/books.mspx



