October 20, 2014 at 12:07 am
Comments posted to this topic are about the item What To Do When the Import and Export Wizard Fails (Part I)
October 20, 2014 at 2:52 am
Link error on page: This web page has a redirect loop
October 20, 2014 at 3:31 am
Yep - unable to see the article due to a redirect loop.
-------------------------------
Oh no!
October 22, 2014 at 7:54 am
Typos in the text:
Four columns of data (not three) numbered 0-3; text says 10000 rows imported, screenshot shows 100000.
Not up to SSC's usual standard
October 22, 2014 at 8:08 am
Thanks Tim
I see one case where this line
On the line Copying to [dbo].[EmployeeFeed] you should see the Status Success and Message - “10000 rows transferred” in the Action column
should be replaced by this correct line
On the line Copying to [dbo].[EmployeeFeed] you should see the Status Success and Message - “100000 rows transferred” in the Action column
Could you please give me the entire sentence for the second case as well?
I will ask the editor to fix it.
Thanks
October 22, 2014 at 3:58 pm
Users of the Import/Export wizard would be well advised to look at another tool that always ships with SQL Server... BCP. The (B)ulk (C)opy (P)rogram is a command-line program that has been bundled with every version of SQL Server since the Sybase days. It is the fastest extraction and loading tool that exists for SQL Server and can be used in a variety of circumstances. I've used it in challenging situations where the Import/Export wizard was either not up to the task or was overkill. Exporting data with BCP in native format is an extremely efficient way to move data between SQL Servers, and it is an excellent backup alternative for individual tables.
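For instance, a minimal sketch of a native-format round trip (server, database, and path names here are made up for illustration):

REM Export dbo.EmployeeFeed in native format (-n) with a trusted connection (-T)
bcp MyDatabase.dbo.EmployeeFeed out C:\Feeds\EmployeeFeed.dat -n -S SourceServer -T

REM Load the same file into a matching table on another server
bcp MyDatabase.dbo.EmployeeFeed in C:\Feeds\EmployeeFeed.dat -n -S TargetServer -T

Native format skips all character conversion, which is a big part of why it is so fast for SQL Server to SQL Server moves.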
Cheers!
Old Guys Rule: Treachery and old age will overcome youth and skill.
October 22, 2014 at 4:44 pm
Brandon,
thank you very much for your input.
Yes, BCP is a good choice.
I did use it. But to use it in a scheduled job, it requires a scripting language; for example, I used Perl.
In my current environment, I cannot use Perl.
Can you suggest any alternatives for using BCP in scheduled jobs?
Thanks
October 22, 2014 at 5:27 pm
You don't need Perl; you could use xp_cmdshell.
For inserting data, you could use pure T-SQL with BULK INSERT.
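For instance, a minimal sketch (server, table, and file names are made up, and xp_cmdshell must first be enabled through sp_configure):

-- Run BCP from T-SQL; this could be the body of a SQL Agent job step
EXEC master..xp_cmdshell 'bcp MyDatabase.dbo.EmployeeFeed out C:\Feeds\EmployeeFeed.dat -n -S MyServer -T';

-- Load a comma-delimited file with pure T-SQL, no shelling out at all
BULK INSERT dbo.EmployeeFeed
FROM 'C:\Feeds\EmployeeFeed.csv'
WITH (FIELDTERMINATOR = ',',
      ROWTERMINATOR = '\n',
      FIRSTROW = 2); -- FIRSTROW = 2 assumes the file has a header row to skip

Alternatively, a SQL Agent job step of type Operating system (CmdExec) can run bcp directly, so no scripting language is needed for scheduling at all.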
October 23, 2014 at 1:15 am
1. There is no need to write a VB script to find the maximum length of all the columns. You can just click the "Suggest Types" button in the wizard. This will scan the file for you and suggest data types accordingly. Be aware that this might take a while for large files.
2. Using 8000 as the length for every string column is overkill. Behind the scenes, SSIS is used. SSIS uses buffers to transfer the data in the data flow. When estimating the size of the buffer, SSIS takes the maximum length of each column, meaning it will reserve 8000 bytes (if non-Unicode) for each column. This means you cannot fit many rows into one buffer - while the actual data might be orders of magnitude smaller - and the SSIS package will be slow. Real slow.
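To put rough numbers on that (assuming the default SSIS buffer size of 10 MB, i.e. 10,485,760 bytes): with ten non-Unicode string columns at 8000 bytes each, the estimated row size is 80,000 bytes, so only about 131 rows fit in each buffer. Size the same columns at varchar(50) instead and the estimate drops to 500 bytes per row, letting each buffer fill up to the default cap of 10,000 rows.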
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
October 23, 2014 at 6:27 am
Koen,
thank you very much for your input.
The max number of rows the Wizard can scan is 10,000.
I need the script when an input file has more than 10,000 rows.
Please confirm that we are on the same page on this.
Yes, 8,000 is overkill. One could start with 1,000 and go up if needed.
Unless one is really under stress and needs it done quickly.
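For what it's worth, a rough T-SQL alternative to the script (just a sketch; the staging table name is made up): bulk-load every column of the file into wide varchar columns of a staging table first, then measure the real maximums:

-- Measure how long each column actually needs to be
-- after loading the raw file into a wide-varchar staging table
SELECT MAX(LEN(Col0)) AS MaxCol0,
       MAX(LEN(Col1)) AS MaxCol1,
       MAX(LEN(Col2)) AS MaxCol2,
       MAX(LEN(Col3)) AS MaxCol3
FROM dbo.EmployeeFeed_Staging;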
Thanks
October 23, 2014 at 7:19 am
Yakov Shlafman (10/23/2014)
Koen, thank you very much for your input.
The max number of rows the Wizard can scan is 10,000.
I need the script when an input file has more than 10,000 rows.
Please confirm that we are on the same page on this.
Yes, 8,000 is overkill. One could start with 1,000 and go up if needed.
Unless one is really under stress and needs it done quickly.
Thanks
Nope, we're not on the same page 😀
That limit was present in SQL Server 2008 (R2). In 2005 it was 1000 rows, but it was lifted after a service pack.
I just tested it with a .csv file.
First I put in two lines with a very small integer; the suggested data type was smallint.
Then I added 12,000 lines and changed the value on the last line from 1 to 10000000000000. I set the sampling limit to 15,000 and the suggested data type was bigint, hence more than 10,000 rows were sampled.
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
October 23, 2014 at 7:39 am
Thanks Koen.
Could you please specify which service pack of SQL Server 2008 R2 can handle more than 10,000?
Could you please provide a link to Books Online/documentation?
Have you tried it on SQL Server 2012 or 2014?
Thanks
October 23, 2014 at 7:45 am
I got it from this thread:
There is no documentation on this limit, that's why I had to test.
It's possible the limit is now 20,000, for example; I wouldn't know.
I tested it on SQL Server 2012.
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
October 23, 2014 at 7:53 am
Thanks Koen.
I read that thread in the past, and people do complain about the issue I described in my article, and I did show a solution. There is no indication that you can raise the bar above 10,000 rows in SQL Server 2008 R2 and below.
Unfortunately I do not have 2012 yet.
I think that people could benefit from my solution.
Thanks
October 23, 2014 at 12:20 pm
Yakov Shlafman (10/23/2014)
I think that people could benefit from my solution.
Sure. It's a pretty decently written article.
Hope to see more of you on this site.
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP