June 7, 2016 at 6:56 am
I have a nightly job that sends a CSV file to our supplier, who then uploads the data from it. The problem is that the field length is 20 characters. The actual data may be up to 20 characters but is most often less, maybe 10-15. The remainder is filled with spaces, which wrecks their import.
I tried RTRIM but had no luck. Is there a way to limit the output to the actual data length instead of getting blank filler at the end?
June 7, 2016 at 7:09 am
how does whitespace wreck the import?
importing a csv is fairly straightforward;
you could bulk insert directly into a staging table, and trim the data before it inserts into the permanent table.
if you use SSIS, you could create derived columns that do the trimming for you, and then use those values to insert into your destination
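a minimal sketch of the staging-table approach (table names, file path, and column widths are placeholders, not from the original post):

```sql
-- Hypothetical staging table: load the raw file as-is, wide varchar columns.
CREATE TABLE dbo.Staging_Inventory (
    item_id      varchar(50),
    qty_on_hand  varchar(50),
    acct         varchar(50),
    date_entered varchar(50)
);

-- Load the raw CSV into staging (path and format options are assumptions).
BULK INSERT dbo.Staging_Inventory
FROM 'C:\imports\Inventory.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- Trim on the way into the permanent table.
INSERT INTO dbo.Inventory (item_id, qty_on_hand, acct, date_entered)
SELECT RTRIM(item_id), RTRIM(qty_on_hand), RTRIM(acct), RTRIM(date_entered)
FROM dbo.Staging_Inventory;
```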
Lowell
June 7, 2016 at 7:13 am
jcobb 20350 (6/7/2016)
I have a nightly job that sends out a csv file to our supplier. They then upload the data from there. The problem is that the field length is 20 characters. The actual data may be up to 20 but most often is less, maybe from 10-15 characters. The remaining length is filled by spaces which wrecks their import.
I tried RTRIM but no luck. Is there a way to limit the query length to actual data length instead of getting blank fillers at the end?
What method are you using to create the CSV file?
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
June 7, 2016 at 7:23 am
Do you have the target set up as a fixed width file?
June 7, 2016 at 7:27 am
ZZartin (6/7/2016)
Do you have the target set up as a fixed width file?
is the datatype of the source CHAR/NCHAR and not varchar/nvarchar?
exporting char values maintains their defined length.
you'd probably want to convert to varchar
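a quick sketch of what that looks like, using the column names from the query posted below (assuming item_id is the padded column):

```sql
-- If the source column is char(20), casting to varchar drops the trailing
-- pad spaces; RTRIM has the same effect on trailing blanks.
SELECT CAST(j.item_id AS varchar(20)) AS item_id
FROM mydb.dbo.jec_report j;
```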
Lowell
June 7, 2016 at 7:30 am
DECLARE @tab char(1) = char(9)
EXEC msdb.dbo.sp_send_dbmail
@recipients = 'xxxx@xxxx.com',
@subject = 'Inventory',
@query = 'SET NOCOUNT ON;
SELECT "Item" AS Item_id
, "Qty" AS qty_on_hand
, "Acct" AS acct
, "Date" AS date_entered;
SELECT RTRIM(j.item_id)
, j.qty_on_hand
, j.acct
, j.date_entered
FROM mydb.dbo.jec_report j
WHERE j.date_entered = CONVERT(varchar(10), GETDATE()-1, 110)',
@body = 'Stock Report attached.',
@attach_query_result_as_file = 1,
@query_attachment_filename = 'Inventory.csv',
@query_result_separator = @tab,
@query_result_header = 0
Here is what I'm pulling and sending out.
June 7, 2016 at 9:01 am
Lowell (6/7/2016)
ZZartin (6/7/2016)
Do you have the target set up as a fixed width file?
is the datatype of the source CHAR/NCHAR and not varchar/nvarchar?
exporting char values maintains their defined length.
you'd probably want to convert to varchar
In the table I have, the field is varchar(20). Can it be cast as the data length somehow?
June 7, 2016 at 2:32 pm
Turns out it's in the syntax. By adding @query_result_no_padding = 1
to the job step, I have the results limited to just the data character count, with no trailing spaces.
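in context, the fix is just one extra parameter on the original sp_send_dbmail call (abbreviated here; the recipient and query are as in the earlier post):

```sql
EXEC msdb.dbo.sp_send_dbmail
    @recipients = 'xxxx@xxxx.com',
    @subject = 'Inventory',
    @query = '...same query as before...',
    @attach_query_result_as_file = 1,
    @query_attachment_filename = 'Inventory.csv',
    @query_result_separator = @tab,
    @query_result_header = 0,
    @query_result_no_padding = 1;  -- strips trailing pad spaces from each column
```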