November 9, 2022 at 3:59 pm
Hi,
I am working to move data from SQL Server to BigQuery. I tried exporting the data to .csv via SSMS (Tasks > Export Data), but noticed that all the columns with numeric data types are missing the leading zero before the decimal point (.0000053653 instead of 0.0000053653). This is causing issues when I try to import into BigQuery. I don't see any export options in SSMS to change that.
Any suggestions?
Thank you in advance!
November 9, 2022 at 4:38 pm
I tried it and replicated the issue. I read somewhere that using numeric instead of decimal solves the problem, but that made no difference for me. The only thing I can think of is to use a query as the source and cast/convert the numeric columns to varchar.
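If re-running the export isn't an option, the file can also be patched after the fact. A rough Python sketch that adds the missing leading zero to bare decimal fractions in the CSV text (just an illustration of the string fix, not tied to SSMS):

```python
import re

def fix_leading_zeros(text):
    # Turn bare fractions like ".123" into "0.123" and "-.123" into "-0.123".
    # The lookbehind keeps values such as "1.5" or "0.25" untouched.
    return re.sub(r"(?<![\d.])(-?)\.(\d)", r"\g<1>0.\2", text)

line = "col_a,.0000053653,-.5,1.5"
print(fix_leading_zeros(line))  # col_a,0.0000053653,-0.5,1.5
```

To fix a whole export, read the .csv file, run each line through `fix_leading_zeros`, and write the result back out.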
November 9, 2022 at 6:09 pm
Hi Ed,
Thank you for your feedback.
Cast/converting numerics to varchar seems to have solved the issue.
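For reference, the same normalization is easy to verify per value: Python's decimal module re-emits a bare fraction with the leading zero when asked for fixed-point notation (a quick illustration, not specific to SSMS or BigQuery):

```python
from decimal import Decimal

# A value written without the leading zero parses fine...
value = Decimal(".0000053653")

# ...and fixed-point formatting restores the "0." prefix on output.
print(format(value, "f"))  # 0.0000053653
```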
Thanks!
November 9, 2022 at 6:33 pm
I have to ask... why is it important to have leading zeros on this particular bit of numeric data in the CSV? Or is it not a CSV file after all?
--Jeff Moden
Change is inevitable... Change for the better is not.
November 9, 2022 at 9:28 pm
Hi Jeff,
Without the leading zeroes (e.g. instead of 0.123 it would show in the .csv file as .123), I could not load the file into BigQuery. The import function was giving me an error when trying to load this into a numeric field.
Thanks!
November 9, 2022 at 9:52 pm
My apologies... the day's coffee is wearing off. You meant the zeros AFTER the decimal point. That seems like an interesting bug that they would go missing during the export for any reason.
Lordy.... that's not what you meant either. Sorry. I get it now.
--Jeff Moden
Change is inevitable... Change for the better is not.