Are the posted questions getting worse?

  • Solomon Rutzky (4/28/2016)


    Grant Fritchey (4/28/2016)


    Technical question:

I want to export large amounts of data. However, I don't want to use SSIS. I'm going to use SQLCMD.EXE. I need to be able to script these exports so that I do, let's say, 10,000 rows at a time, in chunks. I'm doing this so that I can easily restart, and so that I can start migrating or moving the exported files as each chunk finishes.

What's the best way to, in a relatively generic fashion, efficiently divide a table into these 10,000-row chunks?

    Articles, blog posts, some documentation to get this going in the right direction. Please!

    Oh, and, what the heck... URGENT!

    Ha!

    1) What do you mean by "large" amount of data? How many millions of rows and/or how many GB?

    Variable. Let's assume data warehouse size stuff.

    2) What is the ultimate goal here? What are you doing with the exported data?

    I work for Redgate. A tool vendor.

3) Is this a one-time deal or will it be an ongoing process?

    Very ongoing, but not for any one individual or table. It's going to go into an app.

    4) Is the data straight from a single table, or a query combining columns from multiple tables, or a subset of fields from a single table?

Single table. We have to impose some simplicity on this somehow. I suppose we could substitute a view for a table since this is pure export.

    5) If from a single table, are you wanting 100% of the rows? And if yes, is there an INT / BIGINT PK on the table?

100%. Don't know if there will be an INT or not. I would assume we'd need to supply one through ROW_NUMBER or something.

    6) Not SSIS is fine, but does it have to be SQLCMD?

    It needs to be generic and function well on 2008 or better. SQLCMD is the generic tool there, right?
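
A minimal sketch of that ROW_NUMBER chunking idea, written as a sqlcmd script; dbo.SourceTable, SomeStableColumn, and the ChunkNumber scripting variable are all placeholders, not names from this thread, and it stays 2008-compatible by avoiding OFFSET/FETCH:

-- ExportChunk.sql: returns one 10,000-row chunk; SQL Server 2008+
-- $(ChunkNumber) is a sqlcmd scripting variable passed in with -v ChunkNumber=n
WITH NumberedRows AS
(
    SELECT src.*,
           ROW_NUMBER() OVER (ORDER BY src.SomeStableColumn) AS rn
    FROM dbo.SourceTable AS src
)
SELECT *
FROM NumberedRows
WHERE rn >  ($(ChunkNumber) - 1) * 10000
  AND rn <= $(ChunkNumber) * 10000;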

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • Brandie Tarvin (4/28/2016)


    Grant Fritchey (4/28/2016)


    Technical question:

I want to export large amounts of data. However, I don't want to use SSIS. I'm going to use SQLCMD.EXE. I need to be able to script these exports so that I do, let's say, 10,000 rows at a time, in chunks. I'm doing this so that I can easily restart, and so that I can start migrating or moving the exported files as each chunk finishes.

What's the best way to, in a relatively generic fashion, efficiently divide a table into these 10,000-row chunks?

    Articles, blog posts, some documentation to get this going in the right direction. Please!

    Oh, and, what the heck... URGENT!

    Ha!

    Sure. Do the following.

    Send me an email to IMNotAPhisher@brandie.com with your full name, social security number, credit card number, CCV, expiration date, and birth date.

    As soon as I get that information, I'll post all the links you want.

    Except for my last crack, I'm actually serious. If you have some pointers, I need the help.

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • ChrisM@Work (4/28/2016)


    Grant Fritchey (4/28/2016)


    Technical question:

I want to export large amounts of data. However, I don't want to use SSIS. I'm going to use SQLCMD.EXE. I need to be able to script these exports so that I do, let's say, 10,000 rows at a time, in chunks. I'm doing this so that I can easily restart, and so that I can start migrating or moving the exported files as each chunk finishes.

What's the best way to, in a relatively generic fashion, efficiently divide a table into these 10,000-row chunks?

    Articles, blog posts, some documentation to get this going in the right direction. Please!

    Oh, and, what the heck... URGENT!

    Ha!

    Table and index ddl, sample table scripts and execution plans please.

    RVVVVVF from Scary

    ALL the tables.

    No exec plans available.

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • Grant Fritchey (4/28/2016)


    Brandie Tarvin (4/28/2016)


    Grant Fritchey (4/28/2016)


    Technical question:

I want to export large amounts of data. However, I don't want to use SSIS. I'm going to use SQLCMD.EXE. I need to be able to script these exports so that I do, let's say, 10,000 rows at a time, in chunks. I'm doing this so that I can easily restart, and so that I can start migrating or moving the exported files as each chunk finishes.

What's the best way to, in a relatively generic fashion, efficiently divide a table into these 10,000-row chunks?

    Articles, blog posts, some documentation to get this going in the right direction. Please!

    Oh, and, what the heck... URGENT!

    Ha!

    Sure. Do the following.

    Send me an email to IMNotAPhisher@brandie.com with your full name, social security number, credit card number, CCV, expiration date, and birth date.

    As soon as I get that information, I'll post all the links you want.

    Except for my last crack, I'm actually serious. If you have some pointers, I need the help.

    Sorry. I thought you were joking given where you posted. Sure. Give me a few minutes. I'll grab what I can.

Brandie Tarvin, MCITP Database Administrator
LiveJournal Blog: http://brandietarvin.livejournal.com/
On LinkedIn!, Google+, and Twitter.
Freelance Writer: Shadowrun
Latchkeys: Nevermore, Latchkeys: The Bootleg War, and Latchkeys: Roscoes in the Night are now available on Nook and Kindle.

• Grant Fritchey (4/28/2016)


    Technical question:

I want to export large amounts of data. However, I don't want to use SSIS. I'm going to use SQLCMD.EXE. I need to be able to script these exports so that I do, let's say, 10,000 rows at a time, in chunks. I'm doing this so that I can easily restart, and so that I can start migrating or moving the exported files as each chunk finishes.

What's the best way to, in a relatively generic fashion, efficiently divide a table into these 10,000-row chunks?

    Articles, blog posts, some documentation to get this going in the right direction. Please!

    Oh, and, what the heck... URGENT!

    Ha!

    So the way someone I know does this stuff...

    Put an Admin table in one of your databases. This table keeps a list of locations and server names. Basically it's like this:

CREATE TABLE dbo.Redgate_Admin
(
    AdminID INT IDENTITY(1,1) NOT NULL,
    [Key]   VARCHAR(50)  NOT NULL,  -- bracketed: KEY is a reserved word in T-SQL
    [Value] VARCHAR(100) NOT NULL
);

Each key names one setting, so a key called "ExportLocation" would hold the literal or UNC path where your export files get written, etc. You might even define a BatchSize key so you can alter the number of rows you're exporting at will.
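
Seeding it might look like this (the key names come from this post; the server path is made up):

INSERT INTO dbo.Redgate_Admin ([Key], [Value])
VALUES ('ExportLocation', '\\FileServer\Exports\'),
       ('BatchSize', '10000');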

At some point, you're going to have to determine how to track rows you've already exported. I recommend creating a staging table that numbers every row with ROW_NUMBER or an IDENTITY column, importing the data into it, and then maybe adding a DoneNotDone column with a bit flag that switches when the data is exported. That way you kill two birds with one stone.

    There may be a better way of tracking that information, but I use this method because it's quick-n-easy to remember.
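
A rough sketch of that staging pattern, with illustrative names (Export_Staging, Payload, and the chunk math are assumptions, not a fixed design):

-- staging table: the IDENTITY column numbers every row, Done flags exported rows
CREATE TABLE dbo.Export_Staging
(
    RowNum  INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    Payload VARCHAR(MAX) NOT NULL,  -- stand-in for the real exported columns
    Done    BIT NOT NULL DEFAULT (0)
);

-- after a 10,000-row chunk is safely exported, flip its flag
DECLARE @ChunkNumber INT = 1;
UPDATE s
SET    s.Done = 1
FROM   dbo.Export_Staging AS s
WHERE  s.RowNum >  (@ChunkNumber - 1) * 10000
  AND  s.RowNum <= @ChunkNumber * 10000;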

Then open a text editor and create a file that ends in .cmd or .sqlcmd which imports your variables from the table and builds your sqlcmd statement. You can call it from a .bat file or run it as-is using the Operating System (CmdExec) job step in SQL Agent.

    Again, this is the way I've seen it done. Maybe there's a better way, but this is the one I know. I'll see if I can dig up links to show how the import of the variables works since I haven't actually done this myself.
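
For illustration, a bare-bones version of such a .cmd file (server, database, and file names are all hypothetical; ExportChunk.sql is the kind of script sketched earlier in the thread):

@ECHO OFF
REM Export one chunk; %1 is the chunk number passed on the command line
SQLCMD -S MyServer -d MyDatabase -E ^
    -v ChunkNumber=%1 ^
    -i ExportChunk.sql ^
    -o Export_Chunk_%1.txt -s "," -W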

Brandie Tarvin, MCITP Database Administrator
LiveJournal Blog: http://brandietarvin.livejournal.com/
On LinkedIn!, Google+, and Twitter.
Freelance Writer: Shadowrun
Latchkeys: Nevermore, Latchkeys: The Bootleg War, and Latchkeys: Roscoes in the Night are now available on Nook and Kindle.

  • Grant Fritchey (4/28/2016)


    Solomon Rutzky (4/28/2016)


    Grant Fritchey (4/28/2016)


    Technical question:

I want to export large amounts of data. However, I don't want to use SSIS. I'm going to use SQLCMD.EXE. I need to be able to script these exports so that I do, let's say, 10,000 rows at a time, in chunks. I'm doing this so that I can easily restart, and so that I can start migrating or moving the exported files as each chunk finishes.

What's the best way to, in a relatively generic fashion, efficiently divide a table into these 10,000-row chunks?

    Articles, blog posts, some documentation to get this going in the right direction. Please!

    Oh, and, what the heck... URGENT!

    Ha!

    1) What do you mean by "large" amount of data? How many millions of rows and/or how many GB?

    Variable. Let's assume data warehouse size stuff.

    2) What is the ultimate goal here? What are you doing with the exported data?

    I work for Redgate. A tool vendor.

3) Is this a one-time deal or will it be an ongoing process?

    Very ongoing, but not for any one individual or table. It's going to go into an app.

    4) Is the data straight from a single table, or a query combining columns from multiple tables, or a subset of fields from a single table?

Single table. We have to impose some simplicity on this somehow. I suppose we could substitute a view for a table since this is pure export.

    5) If from a single table, are you wanting 100% of the rows? And if yes, is there an INT / BIGINT PK on the table?

100%. Don't know if there will be an INT or not. I would assume we'd need to supply one through ROW_NUMBER or something.

    6) Not SSIS is fine, but does it have to be SQLCMD?

    It needs to be generic and function well on 2008 or better. SQLCMD is the generic tool there, right?

    Ah, ok. Much clearer now. With respect to the overall context being "going into an app" and "generic", I can say:

1) Currently, SQL# (the Full version only) contains a Stored Procedure called DB_BulkExport that can be scripted to do this. It can take any query, so even the output of a Stored Procedure will work. But if you want to do ranges of data, then a raw query works better because you can sort and chop up the data however you like. It creates delimited output and can dynamically text-qualify columns of data types that need it, or text-qualify all fields, or none.

    2) Within the next few months I will be releasing a stand-alone utility that is just for exporting. It will do everything that the SQL#.DB_BulkExport Stored Procedure does, plus allow for spanning multiple files automatically (including handling of the filename), and most likely allow for exporting to native .xlsx file(s).

In both cases the size of the result set / table matters only in terms of needing that amount of disk space for the exported data. But I have personally used the SQL#.DB_BulkExport Stored Procedure to export tens of millions of rows from a table into a file that was over 4 GB, and then used SQL#.File_GZip, which supports the Zip64 format for archives above 4 GB. The process uses minimal memory because it writes the rows to the file as they are read from the result set.

SQL#: https://SQLsharp.com/ (SQLCLR library of over 340 Functions and Procedures)
Sql Quantum Lift: https://SqlQuantumLift.com/ (company)
Sql Quantum Leap: https://SqlQuantumLeap.com/ (blog)
Info sites: Collations • Module Signing • SQLCLR

  • Brandie Tarvin (4/28/2016)


Grant Fritchey (4/28/2016)


    Technical question:

I want to export large amounts of data. However, I don't want to use SSIS. I'm going to use SQLCMD.EXE. I need to be able to script these exports so that I do, let's say, 10,000 rows at a time, in chunks. I'm doing this so that I can easily restart, and so that I can start migrating or moving the exported files as each chunk finishes.

What's the best way to, in a relatively generic fashion, efficiently divide a table into these 10,000-row chunks?

    Articles, blog posts, some documentation to get this going in the right direction. Please!

    Oh, and, what the heck... URGENT!

    Ha!

    So the way someone I know does this stuff...

    Put an Admin table in one of your databases. This table keeps a list of locations and server names. Basically it's like this:

CREATE TABLE dbo.Redgate_Admin
(
    AdminID INT IDENTITY(1,1) NOT NULL,
    [Key]   VARCHAR(50)  NOT NULL,  -- bracketed: KEY is a reserved word in T-SQL
    [Value] VARCHAR(100) NOT NULL
);

Each key names one setting, so a key called "ExportLocation" would hold the literal or UNC path where your export files get written, etc. You might even define a BatchSize key so you can alter the number of rows you're exporting at will.

At some point, you're going to have to determine how to track rows you've already exported. I recommend creating a staging table that numbers every row with ROW_NUMBER or an IDENTITY column, importing the data into it, and then maybe adding a DoneNotDone column with a bit flag that switches when the data is exported. That way you kill two birds with one stone.

    There may be a better way of tracking that information, but I use this method because it's quick-n-easy to remember.

Then open a text editor and create a file that ends in .cmd or .sqlcmd which imports your variables from the table and builds your sqlcmd statement. You can call it from a .bat file or run it as-is using the Operating System (CmdExec) job step in SQL Agent.

    Again, this is the way I've seen it done. Maybe there's a better way, but this is the one I know. I'll see if I can dig up links to show how the import of the variables works since I haven't actually done this myself.

    Excellent stuff. Thank you.

    The one trick we do need is how to track what has been exported. I'm pretty sure that's the hard part of the problem to solve efficiently.

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • Grant Fritchey (4/28/2016)


    Brandie Tarvin (4/28/2016)


Grant Fritchey (4/28/2016)


    Technical question:

I want to export large amounts of data. However, I don't want to use SSIS. I'm going to use SQLCMD.EXE. I need to be able to script these exports so that I do, let's say, 10,000 rows at a time, in chunks. I'm doing this so that I can easily restart, and so that I can start migrating or moving the exported files as each chunk finishes.

What's the best way to, in a relatively generic fashion, efficiently divide a table into these 10,000-row chunks?

    Articles, blog posts, some documentation to get this going in the right direction. Please!

    Oh, and, what the heck... URGENT!

    Ha!

    So the way someone I know does this stuff...

    Put an Admin table in one of your databases. This table keeps a list of locations and server names. Basically it's like this:

CREATE TABLE dbo.Redgate_Admin
(
    AdminID INT IDENTITY(1,1) NOT NULL,
    [Key]   VARCHAR(50)  NOT NULL,  -- bracketed: KEY is a reserved word in T-SQL
    [Value] VARCHAR(100) NOT NULL
);

Each key names one setting, so a key called "ExportLocation" would hold the literal or UNC path where your export files get written, etc. You might even define a BatchSize key so you can alter the number of rows you're exporting at will.

At some point, you're going to have to determine how to track rows you've already exported. I recommend creating a staging table that numbers every row with ROW_NUMBER or an IDENTITY column, importing the data into it, and then maybe adding a DoneNotDone column with a bit flag that switches when the data is exported. That way you kill two birds with one stone.

    There may be a better way of tracking that information, but I use this method because it's quick-n-easy to remember.

Then open a text editor and create a file that ends in .cmd or .sqlcmd which imports your variables from the table and builds your sqlcmd statement. You can call it from a .bat file or run it as-is using the Operating System (CmdExec) job step in SQL Agent.

    Again, this is the way I've seen it done. Maybe there's a better way, but this is the one I know. I'll see if I can dig up links to show how the import of the variables works since I haven't actually done this myself.

    Excellent stuff. Thank you.

    The one trick we do need is how to track what has been exported. I'm pretty sure that's the hard part of the problem to solve efficiently.

I hate to say it, but I think you'll need a table for that. Even if it only contains unique identifiers and export dates, you'll still need a table.
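
Something as small as this would do it; a sketch only, with made-up names:

CREATE TABLE dbo.Export_Log
(
    ExportID    INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    ChunkNumber INT NOT NULL,                           -- which 10,000-row chunk was written
    FileName    VARCHAR(260) NOT NULL,                  -- where the chunk landed
    ExportedAt  DATETIME NOT NULL DEFAULT (GETDATE())   -- when the export finished
);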

Brandie Tarvin, MCITP Database Administrator
LiveJournal Blog: http://brandietarvin.livejournal.com/
On LinkedIn!, Google+, and Twitter.
Freelance Writer: Shadowrun
Latchkeys: Nevermore, Latchkeys: The Bootleg War, and Latchkeys: Roscoes in the Night are now available on Nook and Kindle.

  • MSDN article on scripting variables with SQLCMD: https://msdn.microsoft.com/en-us/library/ms188714.aspx

Dave Pinal's thoughts on the subject.

DBA Stack Exchange: http://dba.stackexchange.com/questions/13860/how-to-pass-in-parameters-to-a-sql-server-script-called-with-sqlcmd

    So, if I understand this correctly, you can use the SQLCMD code to pull the values from your admin table to set variables within the script and then run the script that exports the data required. It sounds pretty simple. You might not even need an extra .bat file because what I'm seeing is a command line (using OS job step) that directly calls the .cmd file, which is written a lot like a .bat file.
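
As a sketch of that idea (the server and database names are assumed; dbo.Redgate_Admin is the admin table from earlier in the thread):

@ECHO OFF
REM Pull a setting out of the admin table into an environment variable,
REM then hand it to the export script as a sqlcmd scripting variable.
FOR /F "usebackq delims=" %%v IN (`SQLCMD -S MyServer -d MyDatabase -E -h -1 -W -Q "SET NOCOUNT ON; SELECT [Value] FROM dbo.Redgate_Admin WHERE [Key] = 'ExportLocation';"`) DO SET "ExportLocation=%%v"
SQLCMD -S MyServer -d MyDatabase -E -v ExportLocation="%ExportLocation%" -i ExportChunk.sql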

Brandie Tarvin, MCITP Database Administrator
LiveJournal Blog: http://brandietarvin.livejournal.com/
On LinkedIn!, Google+, and Twitter.
Freelance Writer: Shadowrun
Latchkeys: Nevermore, Latchkeys: The Bootleg War, and Latchkeys: Roscoes in the Night are now available on Nook and Kindle.

  • Brandie Tarvin (4/28/2016)


    Grant Fritchey (4/28/2016)


    Brandie Tarvin (4/28/2016)


Grant Fritchey (4/28/2016)


    Technical question:

I want to export large amounts of data. However, I don't want to use SSIS. I'm going to use SQLCMD.EXE. I need to be able to script these exports so that I do, let's say, 10,000 rows at a time, in chunks. I'm doing this so that I can easily restart, and so that I can start migrating or moving the exported files as each chunk finishes.

What's the best way to, in a relatively generic fashion, efficiently divide a table into these 10,000-row chunks?

    Articles, blog posts, some documentation to get this going in the right direction. Please!

    Oh, and, what the heck... URGENT!

    Ha!

    So the way someone I know does this stuff...

    Put an Admin table in one of your databases. This table keeps a list of locations and server names. Basically it's like this:

CREATE TABLE dbo.Redgate_Admin
(
    AdminID INT IDENTITY(1,1) NOT NULL,
    [Key]   VARCHAR(50)  NOT NULL,  -- bracketed: KEY is a reserved word in T-SQL
    [Value] VARCHAR(100) NOT NULL
);

Each key names one setting, so a key called "ExportLocation" would hold the literal or UNC path where your export files get written, etc. You might even define a BatchSize key so you can alter the number of rows you're exporting at will.

At some point, you're going to have to determine how to track rows you've already exported. I recommend creating a staging table that numbers every row with ROW_NUMBER or an IDENTITY column, importing the data into it, and then maybe adding a DoneNotDone column with a bit flag that switches when the data is exported. That way you kill two birds with one stone.

    There may be a better way of tracking that information, but I use this method because it's quick-n-easy to remember.

Then open a text editor and create a file that ends in .cmd or .sqlcmd which imports your variables from the table and builds your sqlcmd statement. You can call it from a .bat file or run it as-is using the Operating System (CmdExec) job step in SQL Agent.

    Again, this is the way I've seen it done. Maybe there's a better way, but this is the one I know. I'll see if I can dig up links to show how the import of the variables works since I haven't actually done this myself.

    Excellent stuff. Thank you.

    The one trick we do need is how to track what has been exported. I'm pretty sure that's the hard part of the problem to solve efficiently.

I hate to say it, but I think you'll need a table for that. Even if it only contains unique identifiers and export dates, you'll still need a table.

    Or maybe writing out to a log file.

    Thanks again.

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • I think it is time for me to take a break.

People can't format their code to make it readable, can't post enough information to really help them without guessing at most of what they need, some get upset when you ask for more information, some don't give you what you ask for, and I find it more and more irritating.

    I'll keep an eye here on the water cooler, but other than that, I think I'm done for a while.

  • Does anyone have The Ball today?

    The absence of evidence is not evidence of absence
    - Martin Rees
    The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
    - Phil Parkin

  • Phil Parkin (4/28/2016)


    Does anyone have The Ball today?

    Interesting - not sure what they would do with a General Ledger.

  • Phil Parkin (4/28/2016)


    Does anyone have The Ball today?

    I was thinking

    PRINT '2016 Total: 700'

    "I cant stress enough the importance of switching from a sequential files mindset to set-based thinking. After you make the switch, you can spend your time tuning and optimizing your queries instead of maintaining lengthy, poor-performing code."

    -- Itzik Ben-Gan 2001

  • Phil Parkin (4/28/2016)


    Does anyone have The Ball today?

    Mine just reads "Maybe"

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
