Invoke-SQLCMD

  • Bruin wrote:

    In my original post, that's what I was trying to accomplish.

    No - your original request attempts to connect to a single server and pull data *across a linked server*.  That is not the same as connecting directly to the source and having a separate connection to the destination.

    The problem you are encountering is that the SQL statement does not quote the server name in the four-part name that uses the linked server.

    Your version:

    $Query = "SELECT CAST(sqlToExec AS NVARCHAR(MAX)) FROM at-ts-cblsql.CB_SandBox.dbo.crtDelete where ID > 0 and ID < 1000000"

    Invoke-SqlCmd -ServerInstance "at-sqltest" -Database "Cost" -Query $Query;

    Direct Connect:

    $Query = "SELECT CAST(sqlToExec AS NVARCHAR(MAX)) FROM dbo.crtDelete where ID > 0 and ID < 1000000"

    Invoke-SqlCmd -ServerInstance "at-ts-cblsql" -Database "CB_SandBox" -Query $Query;

    In the direct version - you connect directly to the instance and execute the query.  In your version - you connect to a different instance to run a query across a linked server.  There is no reason to connect to the destination server to get the list of commands since you can connect directly to that instance.

    Once you have that list - then you can connect to the destination and execute each statement.  There isn't any benefit to 'batching' the commands since they will be executed one by one no matter how you structure it.
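    The two-step pattern described above could be sketched like this - a minimal, untested sketch that assumes the server, database, and column names used elsewhere in this thread (`at-ts-cblsql`, `CB_SandBox`, `at-sqltest`, `Cost`, `dbo.crtDelete`):

    ```powershell
    # Step 1: connect directly to the source and pull the ID plus the statement text.
    $commands = Invoke-SqlCmd -ServerInstance 'at-ts-cblsql' -Database 'CB_SandBox' `
        -Query 'SELECT id, sqlToExec = CAST(sqlToExec AS nvarchar(max)) FROM dbo.crtDelete ORDER BY id;'

    # Step 2: connect to the destination and execute each statement one by one.
    foreach ($row in $commands) {
        Invoke-SqlCmd -ServerInstance 'at-sqltest' -Database 'Cost' -Query $row.sqlToExec
    }
    ```

    Carrying the `id` through the loop is what later lets you record - or delete - each row as it is processed.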

    Based on your earlier response - it isn't even clear you have the right server names, since you can connect to 'localhost' but cannot connect to 'at-ts-cblsql' - and it isn't even certain those are the same instance.

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs

    Would it be any better if I dumped to a .sql file from the source and then processed that file on the target? Would it be any quicker?

    How could I keep track of which ID I processed?

    $sourceInstance = "at-ts-cblsql";
    $sourceDatabase = "CB_SandBox";

    $sourceQuery = 'SELECT id,sqlToExec = CAST(d.sqlToExec AS nvarchar(max)) FROM dbo.crtDelete d order by d.id';


    $sqlToExec = Invoke-SqlCmd -ServerInstance $sourceInstance -Database $sourceDatabase -Query $sourceQuery;

    # Write only the statement text; piping the raw rows would dump a formatted table
    $sqlToExec | ForEach-Object { $_.sqlToExec } | Out-File 'c:\cb\DeletesToProcess.sql'
  • You don’t need to use a file for the delete records - you have a memory-resident recordset which you can loop through; you could then output the status to a log along with a processed flag.

    -----------------------------------------------------------------------------------------------------------

    "Ya can't make an omelette without breaking just a few eggs" 😉

  • Okay, thanks. This seems to perform the task at hand, but is there any way, using the ID field, to multi-thread the process instead of going row by row?

    I'm using PS5.1

    $stopwatch = [System.Diagnostics.Stopwatch]::StartNew()

    # Source instance and database
    $ServerInstance = 'at-ts-cblsql'
    $Database = 'cb_sandbox'

    # Target instance and database
    $TargetInstance = 'at-sqltest'
    $TargetDatabase = 'Cost'

    # Query to read the statements from the source table (aliased so the column has a usable name)
    $Query = "SELECT sqlToExec = CAST(sqlToExec AS NVARCHAR(MAX)) FROM dbo.crtDelete WHERE (ID > 0 AND ID < 1000000) ORDER BY ID"

    $counter = 1

    # Execute the query and store the results
    $Data = Invoke-Sqlcmd -ServerInstance $ServerInstance -Database $Database -Query $Query

    # Loop through each row and execute its statement against the destination
    $Data | ForEach-Object {
        $stmt = $_.sqlToExec
        $Error.Clear()

        Invoke-Sqlcmd -Query $stmt -ErrorAction Continue -ServerInstance $TargetInstance -Database $TargetDatabase -OutputSqlErrors $true -Verbose *> $null

        # $Error is never $null - test its count instead
        if ($Error.Count -gt 0) {
            $stmt + ' ' + $Error | Out-File -FilePath "E:\Temp\Deletes_Errors.log" -Append
        }
        else {
            $stmt | Out-File -FilePath "E:\Temp\Deletes_Processed.log" -Append
        }

        Write-Host "Processing Item $counter of $($Data.Count)"
        $counter++
    }

    $stopwatch.Stop()
    Write-Host "Total execution time: $($stopwatch.Elapsed.TotalSeconds) seconds"
  • Sure you can - ask a good developer at your company to do it,

    and you can always try it yourself by googling and trying/testing/adapting the many examples you can find for parallel execution: https://www.google.com/search?q=powershell+execute+queries+in+parallel
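    Since PowerShell 5.1 has no `ForEach-Object -Parallel`, one common pattern is to partition the work by ID range and run each partition in a background job. A rough, untested sketch, assuming the server, database, and table names from the script above, and assuming the SqlServer module is installed so it can be imported inside each job:

    ```powershell
    # Split the ID range into chunks; each chunk runs in its own job.
    $chunks = @(
        @{ From = 1;      To = 250000 },
        @{ From = 250001; To = 500000 },
        @{ From = 500001; To = 750000 },
        @{ From = 750001; To = 1000000 }
    )

    $jobs = foreach ($chunk in $chunks) {
        Start-Job -ArgumentList $chunk.From, $chunk.To -ScriptBlock {
            param($from, $to)
            Import-Module SqlServer   # jobs start a fresh session

            # Pull this chunk's statements directly from the source...
            $rows = Invoke-Sqlcmd -ServerInstance 'at-ts-cblsql' -Database 'cb_sandbox' `
                -Query "SELECT id, sqlToExec = CAST(sqlToExec AS nvarchar(max)) FROM dbo.crtDelete WHERE id BETWEEN $from AND $to ORDER BY id;"

            # ...and execute them one by one against the destination.
            foreach ($row in $rows) {
                Invoke-Sqlcmd -ServerInstance 'at-sqltest' -Database 'Cost' -Query $row.sqlToExec
            }
        }
    }

    $jobs | Wait-Job | Receive-Job
    $jobs | Remove-Job
    ```

    Each job opens its own connections, so four chunks means four concurrent sessions on both servers; whether this is actually faster depends on locking and I/O on the target, not just on the client-side parallelism.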

  • Please review the script I posted earlier - the one you ran that successfully returned a result.  You will notice that the query returns the ID along with the SQL to execute, and that ID can be used inside the loop to delete the processed row from the source table.

    As to running in parallel - that is going to require some advanced PS scripting, which I fear you are not going to be able to implement without a lot of help and input from an experienced developer.

    Instead of immediately moving to a parallel strategy - it might be a better option to at least get a working solution that you understand and can maintain.  Once you have that then you can start looking at various options - if needed.

    Jeffrey Williams

  • Thanks Jeffrey..

    Yes, I'll review the one with ID included. The script I posted above does achieve the desired results. I was thinking that I could just issue a TRUNCATE after processing. What are your thoughts on that approach?

    Thanks again.

  • Bruin wrote:

    Thanks Jeffrey..

    Yes, I'll review the one with ID included. The script I posted above does achieve the desired results. I was thinking that I could just issue a TRUNCATE after processing. What are your thoughts on that approach?

    Thanks again.

    I would not use TRUNCATE for this type of process - what would happen if one or more of the statements failed?  If you truncate the table, you no longer have any way of retrieving those failed statements and executing them again.

    Since this has to be done one by one, after you successfully process a row from the source against the destination, you then delete that row from the source.

    Truncating the table assumes all rows have been processed - but what about the rows created after you pulled the previous set?  Those won't be in your results; you truncate the table, and those unprocessed rows simply disappear without ever being processed.
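    The delete-as-you-go pattern described above could be sketched like this - a hedged sketch that assumes the `id`/`sqlToExec` columns and the server and database names used earlier in the thread, with `$Data` holding the rows pulled from the source:

    ```powershell
    foreach ($row in $Data) {
        try {
            # Execute the statement against the destination...
            Invoke-Sqlcmd -ServerInstance 'at-sqltest' -Database 'Cost' `
                -Query $row.sqlToExec -ErrorAction Stop

            # ...and only on success remove that row from the source queue,
            # so failed statements remain available for a retry.
            Invoke-Sqlcmd -ServerInstance 'at-ts-cblsql' -Database 'cb_sandbox' `
                -Query "DELETE FROM dbo.crtDelete WHERE id = $($row.id);" -ErrorAction Stop
        }
        catch {
            "$($row.id): $_" | Out-File -FilePath 'E:\Temp\Deletes_Errors.log' -Append
        }
    }
    ```

    Unlike a TRUNCATE, this never touches rows that were added after `$Data` was pulled, and any row whose statement failed stays in the table for the next run.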

    Jeffrey Williams

Viewing 8 posts - 16 through 22 (of 22 total)
