December 30, 2024 at 11:53 pm
I have this script that puts all of the output from a query into a single file. Could it be changed to create multiple files based on the query results?
The query will find a month's worth of information in the table, based on the LIKE condition, with ImageName values like this:
20240501
20240502
20240503
Instead of creating one large file, could it create a file for each hit it finds?
Thanks.
#Connection Strings
$Database = "CB_Compare"
$Server = "(local)"
#Export File
$AttachmentPath = "c:\fileloading\rmcstdetails.csv"
# Connect to SQL and query data, extract data to SQL Adapter
$SqlQuery = "
Select * from
RMcstDetails where ImageName like '202405%'"
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = "Data Source=$Server;Initial Catalog=$Database;Integrated Security = True"
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.CommandText = $SqlQuery
$SqlCmd.Connection = $SqlConnection
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $SqlCmd
$DataSet = New-Object System.Data.DataSet
$nRecs = $SqlAdapter.Fill($DataSet)
$nRecs | Out-Null
# Export the data to a pipe-delimited CSV, stripping the quote characters
($DataSet.Tables[0] | ConvertTo-Csv -Delimiter "|" -NoTypeInformation) -replace '"', '' |
Out-File -Force -Encoding utf8 $AttachmentPath
December 31, 2024 at 9:18 pm
Sure - depends on what you consider a 'hit'.
With that said - you have a lot of additional code here that isn't really needed. You can get rid of all of the setup for the connection and replace that with:
$results = Invoke-Sqlcmd -ServerInstance $Server -Database $Database -Query "your query";
Then - you just need to filter the results to each file or create a loop, but that really depends on what a 'hit' means.
For example
$results | Where-Object {$_.SomeColumnName -eq "SomeValue"} | Export-Csv ...;
If a hit is based on some specific column - you could do something like:
$results | ForEach-Object {
$hit_column = $_.HitColumn;
$_ | Export-Csv -Path "c:\fileloading\$($hit_column)_details.csv" -NoTypeInformation -Append;
}
You can even include sorting to ensure the specific 'hit' column rows are sorted so the files are created/appended appropriately.
Jeffrey Williams
“We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”
― Charles R. Swindoll
How to post questions to get better answers faster
Managing Transaction Logs
January 1, 2025 at 12:58 am
Thanks. What I mean by a hit is: each time a NEW date is found, create a new output file. If I ran this query against the table, this is what I'm trying to achieve.
The ImageName field stores a date in the first 8 positions. I want to create output files by day because they are large, I have another process that uses them, and I need somewhat smaller files.
Using the example, it would find 20240501, and for all records matching that mask I want an output file called
20240501.csv.
It should find each day and create the matching output file.
I hope this helps explain.
Select
count(*) as 'ByDay',
SUBSTRING(ImageName,1,8) as 'Day'
from
tblb
group by SUBSTRING(ImageName,1,8)
order by 2
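A minimal PowerShell sketch of that per-day split, assuming the table and column names used earlier in this thread (RMcstDetails, ImageName), the SqlServer module for Invoke-Sqlcmd, and an output folder of c:\fileloading: it pulls the month's rows once, groups them client-side on the first 8 characters of ImageName, and writes one CSV per day.

```powershell
# Pull the month's rows once, then split them into one file per day.
# Assumes the SqlServer module (Invoke-Sqlcmd) and the table/column
# names from earlier in this thread.
$Server   = "(local)"
$Database = "CB_Compare"
$OutDir   = "c:\fileloading"

$results = Invoke-Sqlcmd -ServerInstance $Server -Database $Database `
    -Query "SELECT * FROM RMcstDetails WHERE ImageName LIKE '202405%';"

# Group on the first 8 characters of ImageName (the yyyymmdd date),
# then export each group; $_.Name is the grouping key, e.g. 20240501.
$results | Group-Object { $_.ImageName.Substring(0, 8) } | ForEach-Object {
    $day = $_.Name
    $_.Group |
        Select-Object * -ExcludeProperty ItemArray, RowError, RowState, Table, HasErrors |
        Export-Csv -Path (Join-Path $OutDir "$day.csv") -NoTypeInformation
}
```

Note the Select-Object: Invoke-Sqlcmd returns DataRow objects, which carry extra properties (RowError, RowState, Table, HasErrors, ItemArray) that would otherwise end up as columns in the CSV.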
January 2, 2025 at 12:28 am
I can't help because I don't use PowerShell for much but I am curious. The output file in your original post is "c:\fileloading\rmcstdetails.csv". Where is the "c:\" drive physically located? On the SQL Server?
--Jeff Moden
Change is inevitable... Change for the better is not.
January 2, 2025 at 12:31 am
On my physical laptop ... just testing the process before using Network share.
January 2, 2025 at 12:35 am
On my physical laptop ... just testing the process before using Network share.
Is the "SQL Server" you're testing on also on your laptop?
--Jeff Moden
January 2, 2025 at 12:59 am
Yes,
$Server = "(local)"
January 2, 2025 at 3:14 pm
Was there an issue with creating solution locally before deploy?
January 3, 2025 at 1:21 am
Jeffrey, any thoughts on how to change the way the query gets data to the file so it breaks it out by day?
Thanks.
January 3, 2025 at 4:22 am
Was there an issue with creating solution locally before deploy?
No... it's what happens when you deploy. The C: drive on the server should be avoided for this type of thing. When I say "avoided", I mean it much more strongly, as in NEVER USE THE C: DRIVE OF THE SERVER FOR FILE PROCESSING. There should be another drive available for that type of thing.
As a bit of a sidebar, part of the reason why I don't know PowerShell very well is that I don't use it for this type of thing. The data you're writing to a file comes from a table on SQL Server, and it's nearly trivial to write some dynamic code that builds a BCP command for each date and fires off the BCP program that comes with SQL Server through an excursion into xp_CmdShell. That also means you can use SQL Agent to schedule the job and keep a history of what was done and when in a nice, tight history table, all through one stored procedure. Shoot... you can even call a zip program to bottle up the file and record the number of rows and file sizes and, and, and... and Bob's your uncle.
I haven't done such a thing in a few years but it's pretty easy.
--Jeff Moden
January 3, 2025 at 5:13 am
Thanks for the response. Would you be able to mock up an example of how that can be achieved? I just need an easy way to crank out files based off the query output.
Select * from
RMcstDetails where ImageName like '202405%'
January 3, 2025 at 8:58 pm
Is there some reason you can't build a for-each loop in PS? I gave a basic outline for that - and it should be fairly easy to figure out from there.
If you already know the 'hit' dates, then build a loop over the dates and, inside the loop, export to a CSV file, using the loop's date variable in the filename.
@JeffModen - since PowerShell is just a client, the same as SSMS, you can run it from a desktop, application server, database server, etc. Since we also have the PowerShell subsystem - and the command subsystem (which I prefer for PS scripts) - you can schedule agent jobs to run PS scripts very easily.
I absolutely agree with the C:\ drive comment - if the process is running on the server then you should have a dedicated volume that is not used for database files.
Jeffrey Williams
January 3, 2025 at 9:12 pm
Since it's using LIKE instead of = for the hit dates, would that still work?
Thanks.
January 3, 2025 at 9:22 pm
You have total control over the process - so why can't you just pass the LIKE condition into the query and use that to get only the rows that you want in that specific file?
If you want to expand on it - use 2 queries. The first query returns a single DISTINCT column of 'hits' - the second query uses that 'hit' value to pull the rows that match that 'hit' value.
Loop over the results from the first query - using the second query to generate the output results - and output that to a file with the name generated based on the 'hit' value.
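That two-query pattern can be sketched in PowerShell, again assuming the table and column names from this thread and the SqlServer module:

```powershell
# Query 1: the distinct 'hit' values (the yyyymmdd prefixes of ImageName).
$hits = Invoke-Sqlcmd -ServerInstance "(local)" -Database "CB_Compare" `
    -Query "SELECT DISTINCT SUBSTRING(ImageName, 1, 8) AS Hit
            FROM RMcstDetails WHERE ImageName LIKE '202405%';"

# Query 2, inside the loop: pull only the rows matching that hit value
# and export them to a file named after it.
foreach ($hit in $hits.Hit) {
    Invoke-Sqlcmd -ServerInstance "(local)" -Database "CB_Compare" `
        -Query "SELECT * FROM RMcstDetails WHERE ImageName LIKE '$hit%';" |
        Select-Object * -ExcludeProperty ItemArray, RowError, RowState, Table, HasErrors |
        Export-Csv -Path "c:\fileloading\$hit.csv" -NoTypeInformation
}
```

This runs one query per day rather than grouping client-side, which keeps the memory footprint down when each day's result set is large.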
The logic is going to be the same - whether you implement it in SQL or PS and a loop of some type is going to be needed in either implementation.
Jeffrey Williams