February 8, 2022 at 6:59 pm
I'm not opposed to xp_cmdshell, and I want the csv files to be pipe (|) delimited. These 3 files will be used in a PowerShell import process to SQL as kind of a poor man's replication. The source that is collecting the information is being updated/inserted so heavily that I'm going to run this new process for the previous day to update a backup/report server.
Curious - this is what SSIS is designed to do and would be much easier to implement than using PS. Even using BCP to output a file - and BCP to import the file would be much better.
And if you are going from SQL Server to SQL Server - then BCP in native format or SSIS using native will be even more efficient.
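As a rough illustration of the BCP route, run from PowerShell (an untested sketch; the server, database, query, and file path below are placeholders, not anything from this thread), a pipe-delimited character-format export could look something like this:
# Hypothetical example - substitute your own server, database, query, and output path
bcp "SELECT * FROM YourDatabase.dbo.SomeTable WHERE SomeColumn = 1" queryout "C:\Temp\outputfile1.csv" -c -t "|" -S YourServer -T
The -c switch writes character data, -t sets the field terminator, and -T uses a trusted (Windows) connection; swapping -c for -n gives the native format mentioned above.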
Jeffrey Williams
“We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”
― Charles R. Swindoll
How to post questions to get better answers faster
Managing Transaction Logs
February 8, 2022 at 8:23 pm
I wanted more of a script approach, maybe a little easier to maintain and debug down the road.
February 8, 2022 at 8:24 pm
I'm not opposed to xp_cmdshell, and I want the csv files to be pipe (|) delimited. These 3 files will be used in a PowerShell import process to SQL as kind of a poor man's replication. The source that is collecting the information is being updated/inserted so heavily that I'm going to run this new process for the previous day to update a backup/report server.
You could use the -o option of SQLCmd even from PowerShell to write to a file.
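For example, something along these lines (an untested sketch; the server, database, query, and path are placeholders), where -s sets the pipe as the column separator and -W trims trailing spaces:
# Hypothetical example - substitute your own server, database, query, and output path
sqlcmd -S YourServer -d YourDatabase -Q "SELECT * FROM dbo.SomeTable" -o "C:\Temp\outputfile1.csv" -s "|" -W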
--Jeff Moden
Change is inevitable... Change for the better is not.
February 8, 2022 at 8:29 pm
I was looking back at this example you sent:
$queryResults = Invoke-SqlCmd ...;
Then something like this would be one option (probably better options are available):
$queryResults | ? {$_.Category -eq 1} | Export-Csv ...;
$queryResults | ? {$_.Category -eq 2} | Export-Csv ...;
$queryResults | ? {$_.Category -eq 3} | Export-Csv ...;
How do I get the results of the CASE 1, 2, 3 into my pipeline to export the different time-based files?
February 8, 2022 at 8:42 pm
It would really help if you would at least try testing a solution. For example:
$queryResults = Invoke-Sqlcmd -ServerInstance servername `
-Query "Select id = 1, Category = 1, ColumnValue = 'This is row 1'
Union All Select id = 2, Category = 2, ColumnValue = 'This is row 2'
Union All Select id = 3, Category = 3, ColumnValue = 'This is row 3'";
$queryResults | ? {$_.Category -eq 1} | Export-Csv C:\Windows\Temp\outputfile1.csv -NoTypeInformation;
$queryResults | ? {$_.Category -eq 2} | Export-Csv C:\Windows\Temp\outputfile2.csv -NoTypeInformation;
$queryResults | ? {$_.Category -eq 3} | Export-Csv C:\Windows\Temp\outputfile3.csv -NoTypeInformation;
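If the files need to be pipe-delimited, as mentioned earlier in the thread, Export-Csv also accepts a -Delimiter parameter; a minimal variation on the last line above:
$queryResults | ? {$_.Category -eq 3} | Export-Csv C:\Windows\Temp\outputfile3.csv -NoTypeInformation -Delimiter '|';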
Jeffrey Williams
“We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”
― Charles R. Swindoll
How to post questions to get better answers faster
Managing Transaction Logs
February 8, 2022 at 8:52 pm
Just curious... I might have missed it in the now 4 pages of posts, but how many rows per day are you looking to do this for?
--Jeff Moden
Change is inevitable... Change for the better is not.
February 8, 2022 at 9:07 pm
Not sure I understand the question, but I'm looking at all records from the previous day, which get split out using:
CASE WHEN datepart(hour, dt.dt_stamp) < 8 THEN 1
WHEN datepart(hour, dt.dt_stamp) < 16 THEN 2
ELSE 3
END
I want 3 output files...
Based upon the original query:
Declare @start_date datetime = '20220207',
@end_date datetime = '20220207';
Declare @ts_start bigint = cast(1000 As bigint) * datediff(second, 25567, cast(@start_date As date))
, @ts_end bigint = cast(1000 As bigint) * datediff(second, 25567, cast(@end_date + 1 As date));
Select *
From sqlt_data_1_2022_02 dc With (NoLock)
Cross Apply (Values (dateadd(millisecond, dc.t_stamp % 1000, dateadd(second, dc.t_stamp / 1000, 25567)))) As dt(dt_stamp)
Where dc.t_stamp >= @ts_start
And dc.t_stamp < @ts_end;
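For illustration, one way these pieces might fit together (an untested sketch: the Category CASE is added to the SELECT, the server name and output paths are placeholders, and the comment about t_stamp being a Unix epoch in milliseconds is an assumption based on the math above):
# Sketch only - replace servername and the output paths; dates are hard-coded as in the query above
$sql = @'
Declare @start_date datetime = '20220207',
        @end_date datetime = '20220207';
-- 25567 = days from 1900-01-01 to 1970-01-01; t_stamp appears to be a Unix epoch in milliseconds
Declare @ts_start bigint = cast(1000 As bigint) * datediff(second, 25567, cast(@start_date As date))
      , @ts_end bigint = cast(1000 As bigint) * datediff(second, 25567, cast(@end_date + 1 As date));
Select dc.*
     , Category = Case When datepart(hour, dt.dt_stamp) < 8 Then 1
                       When datepart(hour, dt.dt_stamp) < 16 Then 2
                       Else 3
                  End
From sqlt_data_1_2022_02 dc With (NoLock)
Cross Apply (Values (dateadd(millisecond, dc.t_stamp % 1000, dateadd(second, dc.t_stamp / 1000, 25567)))) As dt(dt_stamp)
Where dc.t_stamp >= @ts_start
  And dc.t_stamp < @ts_end;
'@;
$queryResults = Invoke-Sqlcmd -ServerInstance servername -Query $sql;
$queryResults | ? {$_.Category -eq 1} | Export-Csv C:\Windows\Temp\outputfile1.csv -NoTypeInformation -Delimiter '|';
$queryResults | ? {$_.Category -eq 2} | Export-Csv C:\Windows\Temp\outputfile2.csv -NoTypeInformation -Delimiter '|';
$queryResults | ? {$_.Category -eq 3} | Export-Csv C:\Windows\Temp\outputfile3.csv -NoTypeInformation -Delimiter '|';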
February 8, 2022 at 10:26 pm
My question is simple... how many rows per day will that normally include?
--Jeff Moden
Change is inevitable... Change for the better is not.
February 9, 2022 at 1:22 pm
10,000 of thousands I expect for each of the 3 files...
February 9, 2022 at 5:22 pm
10,000 of thousands I expect for each of the 3 files...
I'm not sure what that means. I just want to know the total number of rows that the entire process has to handle each day.
--Jeff Moden
Change is inevitable... Change for the better is not.
February 9, 2022 at 5:40 pm
100,000 rows
February 11, 2022 at 2:28 am
Did the number of rows per day dictate a certain method for export?
February 11, 2022 at 2:55 am
Did the number of rows per day dictate a certain method for export?
It does, indeed, and the import as well. And, just to be sure, this is a pure data transfer, correct? In other words, everything that you export will be imported on the other side without having to suffer the rigors of an "Upsert", correct? And, if you don't know what the jargon of "Upsert" implies, an "Upsert" has to figure out what exists and what does not exist and then determine which rows must be updated and which rows must be inserted. You're not doing any of that on the receiving end of this data because you're going to import ALL of the data from a daily "set" of "CSV" files, correct?
One other question... you're exporting the data from SQL Server. Is the consumer of the data also an SQL Server? And, yeah... that's a wicked import question.
--Jeff Moden
Change is inevitable... Change for the better is not.
February 11, 2022 at 3:00 am
Thanks for the replies...
Yes, it's SQL to SQL, and there's no ETL on the data from export to import.
February 11, 2022 at 3:03 am
With that thought in mind, I have to ask, why are you messing around with the likes of "CSV" files and PowerShell, etc? It's a simple thing to BCP a table out in "Native" format, which avoids the conversion mistakes you can make with CSV, etc, etc. On the other side, then just do another BCP into the table and Bob's your uncle.
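A rough sketch of what that round trip might look like, run from PowerShell (untested; the server, database, table, and path names are placeholders, not anything from this thread):
# Hypothetical example: native-format export on the source, native-format import on the target
bcp SourceDatabase.dbo.SomeStagingTable out "C:\Temp\SomeStagingTable.dat" -n -S SourceServer -T
bcp TargetDatabase.dbo.SomeStagingTable in "C:\Temp\SomeStagingTable.dat" -n -S TargetServer -T
If only the previous day's rows are wanted on the way out, the first command could use queryout with the filtered query instead of the whole table.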
--Jeff Moden
Change is inevitable... Change for the better is not.