April 1, 2002 at 6:39 pm
Hi, I need to translate some Oracle SQL code to SQL Server. In my Oracle code, I have this:
dbms_output.enable(1000000);
Then I print out my lines using:
dbms_output.put_line(my_line);
At the Perl level, I am doing this:
$dbh->func(1000000, 'dbms_output_enable');
Then I execute the stored procedure and get the output strings back with:
my(@out_string) = $dbh->func('dbms_output_get');
Any suggestion on how I can achieve the same thing in SQL Server, or any other way to accomplish the task? The reason I wrote that SQL in Oracle is that I need to grab various data from around 20 tables, put it in comma-separated (CSV) format, and return the set of data to the web.
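In case it helps, this is roughly what I have in mind at the Perl level on the SQL Server side, with the stored procedure returning the CSV lines as an ordinary result set instead of using dbms_output. It is only a sketch; the DSN, login, and procedure name my_csv_proc are made up, and I am assuming DBD::ODBC:

use DBI;

# Placeholder connection details; assuming DBD::ODBC against SQL Server.
my $dbh = DBI->connect('dbi:ODBC:my_dsn', 'my_user', 'my_password',
                       { RaiseError => 1 });

# The stored procedure would simply SELECT the concatenated CSV strings
# as a result set rather than calling anything like dbms_output.put_line.
my $sth = $dbh->prepare('exec my_csv_proc');
$sth->execute;

# The rows come back like any other query result.
my @out_string;
while ( my ($line) = $sth->fetchrow_array ) {
    push @out_string, $line;
}

$dbh->disconnect;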
Any suggestion? Thank you very much.
Abby Zhang
April 2, 2002 at 10:04 am
April 8, 2002 at 5:08 pm
Hi, putting my data into a temp table is a good idea, but I ran into this problem:
My data is pieced together from around 30 tables, and after concatenating it, each row's length can be around 32,000 characters.
When I tried to create a temp table with a couple of varchar(8000) columns, I got this warning:
The table mv_temp_table has been created but its maximum row size (32340) exceeds the maximum number of bytes per row (8060).
INSERT or UPDATE of a row in this table will fail if the resulting row length exceeds 8060 bytes.
Any suggestion on a workaround is appreciated.
April 9, 2002 at 10:38 am
Use a text datatype, or split it across multiple tables/rows. Not sure how this will work with a bulk copy out; you'd have to test it.
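Something like this, as a rough sketch (the table and column names are just examples, and I'm assuming the Perl DBI handle you already have):

use DBI qw(:sql_types);

# $dbh is the DBI handle from the earlier connect.
# A text column is stored off-row, so it is not counted against the
# 8060-byte row limit; each long CSV line can go into a single column.
$dbh->do(q{
    create table mv_temp_table (
        line_id  int identity(1,1),
        csv_line text
    )
});

# Insert each concatenated line as one value. Values longer than 8000
# characters may need to be bound explicitly as a long type.
my $long_csv_line = '...';   # one concatenated CSV row (placeholder)
my $ins = $dbh->prepare('insert into mv_temp_table (csv_line) values (?)');
$ins->bind_param(1, $long_csv_line, SQL_LONGVARCHAR);
$ins->execute;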
Steve Jones
April 9, 2002 at 6:27 pm
I created a table with a column of text type and inserted some data. BCP works fine!
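For the record, a sketch of the kind of bcp call I mean, wrapped in Perl's system (the server, database, and login details are placeholders):

# Character-mode (-c) export of the table straight to a file.
my $cmd = 'bcp mydb..mv_temp_table out output.csv -c '
        . '-S myserver -U mylogin -P mypassword';
system($cmd) == 0 or die "bcp failed: $?";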
Thank you very much.
April 10, 2002 at 11:52 am