September 21, 2016 at 7:23 am
Hello All,
Can someone walk me through the process of using BCP to import bulk data from Excel or flat files into a table in SQL Server 2012 or 2014?
I have tried resources online, but I keep getting errors.
See my code below:
DECLARE @DbName NVARCHAR (10) =(select DB_NAME(db_id()))
IF OBJECT_ID('InventoryFillStaging', 'U') IS NOT NULL
DROP TABLE InventoryFillStaging;
CREATE TABLE InventoryFillStaging
([Metro rank] float(8), [Metro Name] nvarchar(255), [DMA#] float(8), [DMA Name] nvarchar(255),
[Calls] nvarchar(255), [Show Name] nvarchar(255), [Contract Start Date] datetime, [Add Date] datetime, [CP Name] nvarchar(255),
[Daypart] nvarchar(255), [Division Name] nvarchar(255), [Minutes] float(8), [Status] nvarchar(255), [Format] nvarchar(255),
[Group Name] nvarchar(255), [M] float(8), [T] float(8), [W] float(8), [T1] float(8), [F] float(8), [S] float(8), [S1] float(8),
[StartTime] float(8), [EndTime] float(8), [MarketID] float(8),[StationID] float(8), [Priority] float(8), [SpotType] float(8),
[Length] float(8), [NumSpots] float(8) )
EXEC sp_configure 'show advanced options', 1;
GO
RECONFIGURE;
GO
EXEC sp_configure 'xp_cmdshell', 1;
GO
RECONFIGURE;
GO
EXECUTE master.dbo.xp_cmdshell 'BCP GrandRapids.dbo.InventoryFillStaging in "\\stratus-be\traffic\StagingFiles\WWOInventoryUpdated.csv" -t,-c-T-S'
SELECT * FROM dbo.InventoryFillStaging
I learn from the footprints of giants......
September 21, 2016 at 7:28 am
Can you paste the text of the error message, please?
The absence of evidence is not evidence of absence.
Martin Rees
You can lead a horse to water, but a pencil must be lead.
Stan Laurel
September 21, 2016 at 7:37 am
Please, also share a sample of your input file.
Here's also the best explanation I've had on importing flat files: http://www.sqlservercentral.com/articles/BCP+(Bulk+Copy+Program)/105867/
September 21, 2016 at 7:44 am
output
User name not provided, either use -U to provide the user name or use -T for Trusted Connection
usage: BCP {dbtable | query} {in | out | queryout | format} datafile
[-m maxerrors] [-f formatfile] [-e errfile]
[-F firstrow] [-L lastrow] [-b batchsize]
[-n native type] [-c character type] [-w wide character type]
[-N keep non-text native] [-V file format version] [-q quoted identifier]
[-C code page specifier] [-t field terminator] [-r row terminator]
[-i inputfile] [-o outfile] [-a packetsize]
[-S server name] [-U username] [-P password]
[-T trusted connection] [-v version] [-R regional enable]
[-k keep null values] [-E keep identity values]
[-h "load hints"] [-x generate xml format file]
[-d database name]
NULL
I learn from the footprints of giants......
September 21, 2016 at 7:48 am
There are many fields and columns; if I posted them all here, they would be disarranged.
Here are a few of the fields...
Metro rank | Metro Name | DMA# | DMA Name | Calls | Show Name | Contract Start Date | Add Date
68 | Grand Rapids, MI | 41 | Grand Rapids-Kalmzoo-Battle Crk | WBBL-FM | This Week In NCAA | 12/28/2015 | 1/7/2016
68 | Grand Rapids, MI | 41 | Grand Rapids-Kalmzoo-Battle Crk | WBBL-FM | NFL Gameday Preview Vignettes | 9/5/2016 | 7/22/2016
68 | Grand Rapids, MI | 41 | Grand Rapids-Kalmzoo-Battle Crk | WBBL-FM | NFL Gameday Preview Vignettes | 9/5/2016 | 7/22/2016
68 | Grand Rapids, MI | 41 | Grand Rapids-Kalmzoo-Battle Crk | WBBL-FM | NFL Gameday Preview Vignettes | 9/5/2016 | 7/22/2016
68 | Grand Rapids, MI | 41 | Grand Rapids-Kalmzoo-Battle Crk | WJRW-AM | Something To Think About with Mike Rogers | 1/1/2015 | 1/1/2015
I learn from the footprints of giants......
September 21, 2016 at 7:59 am
JALLYKAMOZE (9/21/2016)
output
User name not provided, either use -U to provide the user name or use -T for Trusted Connection
usage: BCP {dbtable | query} {in | out | queryout | format} datafile
[-m maxerrors] [-f formatfile] [-e errfile]
[-F firstrow] [-L lastrow] [-b batchsize]
[-n native type] [-c character type] [-w wide character type]
[-N keep non-text native] [-V file format version] [-q quoted identifier]
[-C code page specifier] [-t field terminator] [-r row terminator]
[-i inputfile] [-o outfile] [-a packetsize]
[-S server name] [-U username] [-P password]
[-T trusted connection] [-v version] [-R regional enable]
[-k keep null values] [-E keep identity values]
[-h "load hints"] [-x generate xml format file]
[-d database name]
NULL
So presumably you tried providing a user name or adding -T?
The absence of evidence is not evidence of absence.
Martin Rees
You can lead a horse to water, but a pencil must be lead.
Stan Laurel
September 21, 2016 at 8:06 am
You need to add spaces between the parameters, use the correct field delimiter (you stated comma, but the sample uses tab), set the first row to skip the header, set the server name, and make sure that the columns in the table match the columns in the file.
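Something along these lines (a sketch only; "YourServerName" is a placeholder, the path is the one from your earlier post, and -F 2 assumes the header is on the first row). Since tab is bcp's default field terminator in character mode, -t can simply be omitted:
EXECUTE master.dbo.xp_cmdshell 'bcp GrandRapids.dbo.InventoryFillStaging in "\\stratus-be\traffic\StagingFiles\WWOInventoryUpdated.csv" -c -F 2 -T -S YourServerName';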
September 21, 2016 at 10:40 am
I changed the file to a .txt file; see the error I am still getting:
NULL
Starting copy...
SQLState = 22005, NativeError = 0
Error = [Microsoft][SQL Server Native Client 10.0]Invalid character value for cast specification
NULL
43 rows copied.
Network packet size (bytes): 4096
Clock Time (ms.) Total : 1 Average : (43000.00 rows per sec.)
NULL
It says the data was copied, but nothing is in the table when I select from it.
See my SQL script below:
IF OBJECT_ID('InventoryFillStaging', 'U') IS NOT NULL
DROP TABLE InventoryFillStaging;
CREATE TABLE InventoryFillStaging
([Metro rank] float(8), [Metro Name] nvarchar(255), [DMA#] float(8), [DMA Name] nvarchar(255),
[Calls] nvarchar(255), [Show Name] nvarchar(255), [Contract Start Date] datetime, [Add Date] datetime, [CP Name] nvarchar(255),
[Daypart] nvarchar(255), [Division Name] nvarchar(255), [Minutes] float(8), [Status] nvarchar(255), [Format] nvarchar(255),
[Group Name] nvarchar(255), [M] float(8), [T] float(8), [W] float(8), [T1] float(8), [F] float(8), [S] float(8), [S1] float(8),
[StartTime] float(8), [EndTime] float(8), [MarketID] float(8),[StationID] float(8), [Priority] float(8), [SpotType] float(8),
[Length] float(8), [NumSpots] float(8) )
EXECUTE master.dbo.xp_cmdshell 'bcp [grandrapids].dbo.InventoryFillStaging in "\\stratus-be\traffic\STRATUS-DBDEV\StagingFiles\WWOInventoryUpdated.txt" -c -T'
SELECT * FROM dbo.InventoryFillStaging
I learn from the footprints of giants......
September 21, 2016 at 10:44 am
JALLYKAMOZE (9/21/2016)
I changed the file to a .txt file; see the error I am still getting:
NULL
Starting copy...
SQLState = 22005, NativeError = 0
Error = [Microsoft][SQL Server Native Client 10.0]Invalid character value for cast specification
NULL
43 rows copied.
Network packet size (bytes): 4096
Clock Time (ms.) Total : 1 Average : (43000.00 rows per sec.)
NULL
It says the data was copied, but nothing is in the table when I select from it.
See my SQL script below:
IF OBJECT_ID('InventoryFillStaging', 'U') IS NOT NULL
DROP TABLE InventoryFillStaging;
CREATE TABLE InventoryFillStaging
([Metro rank] float(8), [Metro Name] nvarchar(255), [DMA#] float(8), [DMA Name] nvarchar(255),
[Calls] nvarchar(255), [Show Name] nvarchar(255), [Contract Start Date] datetime, [Add Date] datetime, [CP Name] nvarchar(255),
[Daypart] nvarchar(255), [Division Name] nvarchar(255), [Minutes] float(8), [Status] nvarchar(255), [Format] nvarchar(255),
[Group Name] nvarchar(255), [M] float(8), [T] float(8), [W] float(8), [T1] float(8), [F] float(8), [S] float(8), [S1] float(8),
[StartTime] float(8), [EndTime] float(8), [MarketID] float(8), [StationID] float(8), [Priority] float(8), [SpotType] float(8),
[Length] float(8), [NumSpots] float(8) )
EXECUTE master.dbo.xp_cmdshell 'bcp [grandrapids].dbo.InventoryFillStaging in "\\stratus-be\traffic\STRATUS-DBDEV\StagingFiles\WWOInventoryUpdated.txt" -c -T'
SELECT * FROM dbo.InventoryFillStaging
Sounds like a data issue. Could one of the numeric columns contain non-numeric data in the input file?
The absence of evidence is not evidence of absence.
Martin Rees
You can lead a horse to water, but a pencil must be lead.
Stan Laurel
September 21, 2016 at 10:52 am
There's a column with the header Daypart, and the data looks like this:
Daypart
FrSu 6A-9P
Mo,Th 10A-3P
Mo,Th 3P-7P
Mo,Th 6A-10A
MoFr 10A-3P
MoFr 3P-7P
I learn from the footprints of giants......
September 21, 2016 at 11:22 am
Use the maxerrors parameter (-m) with a large value to catch the bad rows in an error file defined with the -e parameter.
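For example (a sketch; the server name and error-file path are placeholders, the data file path is from the earlier post):
EXECUTE master.dbo.xp_cmdshell 'bcp [grandrapids].dbo.InventoryFillStaging in "\\stratus-be\traffic\STRATUS-DBDEV\StagingFiles\WWOInventoryUpdated.txt" -c -F 2 -T -S YourServerName -m 1000 -e "C:\Temp\WWOInventory.err"';
The rows that fail the float/datetime conversions end up in the error file instead of stopping the whole load, so you can see exactly which lines and values are the problem.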
September 21, 2016 at 2:56 pm
You don't need to use XP_CmdShell and BCP to import unless you're working around a privs issue.
And, to skip the headers, use the "First Row" option.
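A rough sketch of that approach (assuming the SQL Server service account can read the share; the path is the one from the earlier post):
BULK INSERT dbo.InventoryFillStaging
FROM '\\stratus-be\traffic\STRATUS-DBDEV\StagingFiles\WWOInventoryUpdated.txt'
WITH (
    FIELDTERMINATOR = '\t',  -- tab-delimited file
    ROWTERMINATOR = '\n',
    FIRSTROW = 2             -- skip the header row
);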
--Jeff Moden
Change is inevitable... Change for the better is not.
September 21, 2016 at 10:11 pm
I import a lot of flat text, and I would use BULK INSERT to do it from the server side and BCP to do it from the client side. BCP doesn't like Excel files and it doesn't like text qualifiers, so I always convert either of those file types into tab-separated values, which is the default delimiter for both BCP and BULK INSERT. This simplifies things and removes the requirement (in most cases) for text qualifiers.
BULK INSERT [schema].[table]
FROM 'pathToFile' WITH (
FIELDTERMINATOR = '\t'
,ROWTERMINATOR = '\n'
);
September 22, 2016 at 7:40 am
david.leyden (9/21/2016)
I import a lot of flat text, and I would use BULK INSERT to do it from the server side and BCP to do it from the client side. BCP doesn't like Excel files and it doesn't like text qualifiers, so I always convert either of those file types into tab-separated values, which is the default delimiter for both BCP and BULK INSERT. This simplifies things and removes the requirement (in most cases) for text qualifiers.
BULK INSERT [schema].[table]
FROM 'pathToFile' WITH (
FIELDTERMINATOR = '\t'
,ROWTERMINATOR = '\n'
);
Just a couple of notes here...
BCP and BULK INSERT handle so-called text qualifiers exactly the same way. In your example above, both will leave a leading double quote if the first field in the file is double quoted. Neither will import a real .xls file. The best thing for Excel spreadsheet imports is the ACE driver and OPENROWSET.
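If the ACE provider is installed and ad hoc distributed queries are enabled, an OPENROWSET read of a spreadsheet looks roughly like this (the file path and sheet name below are placeholders, not anything from the original post):
SELECT *
FROM OPENROWSET(
    'Microsoft.ACE.OLEDB.12.0',
    'Excel 12.0; Database=\\stratus-be\traffic\StagingFiles\WWOInventory.xlsx; HDR=YES',
    'SELECT * FROM [Sheet1$]'
);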
Consistency is one of the most important things when importing a TSV or CSV exported from Excel. The blooming thing will only quote-delimit the values that contain a tab or a comma, and that throws a monkey wrench into both types of imports.
A workaround is to quote-delimit everything. The Import-Csv and Export-Csv cmdlets in PowerShell are really good for that as a pre-processor and fairly decent for performance, as well.
--Jeff Moden
Change is inevitable... Change for the better is not.