Output to file

  • What T-SQL commands are available for writing and reading to/from a text file?

    I need a low tech interface to another application.

    Any advice as to what to do/not do, use/not use?

  • Use the 'bcp' command line utility.

    bcp "[database].[owner].[<<TableName>>]" out NameOfOutputFile.Extension -c -U"<<ValidSQLUsername>>" -P"<<PasswordOfSQLUser>>" -S"<<SERVERNAME>>"

    Regards, Hans

  • Importing from a TextFile with T-SQL:

    BULK INSERT Test.dbo.inter_ExportJob FROM '<<NameOfOutputFile.Extension>>'

    Remember that the file has to be reachable from the SQL Server machine and match the required structure. Look in BOL for how to define file structures other than TAB-separated columns and CR+LF-separated rows (under Windows).

    Remember that when importing large data volumes (GB and similar), this can take a great deal of time and hold locks for a very long time, which might not really be what you want. Experiment with BATCHSIZE: it increases the number of transactions (shorter duration, less lock time, and so on), so you can find a reasonable balance between performance and lock duration(s).

    Regards, Hans!

    Edited by - hanslindgren on 05/28/2003 07:00:00 AM
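    Putting Hans's points together, a fuller form of the statement above might look like the following. The file path and batch size are illustrative, not from the original post:

    ```sql
    -- Load a TAB-separated, CR+LF-terminated file in 10,000-row batches.
    -- File path and BATCHSIZE are hypothetical; adjust to your environment.
    BULK INSERT Test.dbo.inter_ExportJob
    FROM 'C:\exports\ExportJob.txt'      -- must be reachable from the SQL Server machine
    WITH (
        FIELDTERMINATOR = '\t',          -- TAB-separated columns (the default)
        ROWTERMINATOR   = '\n',          -- BOL treats \n as CR+LF under Windows
        BATCHSIZE       = 10000          -- commit every 10,000 rows: shorter transactions, shorter lock times
    );
    ```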

  • These could work.... But I have to manipulate the data/text to a specific format that is not represented in the data tables themselves.

    In other words I would like to execute a query, get the data, use some string manipulation and then write the data to a file.

    On the other side, I would like to read from a file and decide what to do with the data (not load it straight into a table).

    Sorry I did not include enough details.

  • Export from Query:

    bcp "SELECT * FROM master..sysobjects ORDER BY name" queryout NameOfOutputFile.Extension -c -U"<<ValidSQLUsername>>" -P"<<PasswordOfSQLUser>>" -S"<<SERVERNAME>>"

    Import: do a BULK INSERT into a #tbl, process your data and then do INSERT INTO <your Final Destination table> SELECT <columns|*>

    Hope that solves your problem!

    Regards, Hans!
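    The import side Hans describes can be sketched as below. All table and column names, the file path, and the string manipulation are hypothetical placeholders for whatever processing you actually need:

    ```sql
    -- Sketch: bulk load into a temp table, massage the strings,
    -- then insert into the real destination table.
    CREATE TABLE #staging (RawLine varchar(8000));

    BULK INSERT #staging FROM 'C:\imports\incoming.txt';

    -- Example manipulation only: trim and upper-case before the final insert.
    INSERT INTO FinalDestination (CleanLine)
    SELECT UPPER(LTRIM(RTRIM(RawLine)))
    FROM #staging;

    DROP TABLE #staging;
    ```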

  • I think I can use the BCP option. Thank you.

    If it turns out that I can't and need to read or write (for example) a line at a time from within a SP, how would that be done?

    I don't think you can read a line at a time with BCP.

    You can write a line at a time using xp_File, or you could read the entire file into a temp table using BULK INSERT or BCP and read the nth row from the table.

    Curious to see if there is another way as I use the suggested method.

    Cheers,

    Crispin

    I can't die, there are too many people who still have to meet me! It's not a bug, SQL just misunderstood me!
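    One 2000-era way to do the "read the nth row" step Crispin mentions is to number the rows with SELECT ... INTO and IDENTITY() after the bulk load. The file path and row number are illustrative; note that without an ORDER BY there is no strict guarantee the numbering matches file order, though in practice BULK INSERT loads sequentially:

    ```sql
    -- Load the whole file, number the rows, then pick the one you want.
    CREATE TABLE #raw (Line varchar(8000));
    BULK INSERT #raw FROM 'C:\imports\incoming.txt';

    SELECT IDENTITY(int, 1, 1) AS RowNum, Line
    INTO #numbered
    FROM #raw;

    SELECT Line FROM #numbered WHERE RowNum = 5;   -- e.g. the 5th line

    DROP TABLE #numbered;
    DROP TABLE #raw;
    ```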

  • The link for xp_File.

    http://www.sqlservercentral.com/products/jgama/XP_file/

    It's an extended proc, so it's only for SQL Server 2000.

    Cheers,

    Crispin


  • I followed the link...

    What am I downloading?

    Am I taking any risk by installing this?

    Use the download button at the bottom of the page.

    It's an extended proc which you "install" on SQL.

    The explanation he gives makes it easy to use.

    I have used it and not had any problems. It runs under SQL Server, so no, it is not a virus or anything harmful.

    If you want to dump results to file, this is the tool to use.

    If you want to read a file from within a proc, BULK INSERT would be easier than BCP.

    Have a look in BOL on "System Stored Procedures"

    Cheers,

    Crispin


    I have tried compiling xp_file but it fails. This may well be because I am using VC++; must it be compiled in plain C only?

  • Here I have a couple of cents to dice out:

    Writing a file a line at a time seems a bit too programmatic an approach for SQL. Doesn't it still mean that the whole table has to be dumped (line by line) to the file at some point? In my opinion, installing a third-party XP just for 'overwriting' or 'rewriting' a line in a file is excessive work. Isn't this approach circumventing SQL Server in a way?

    If the export in T-SQL is the problem, why not launch xp_cmdshell (which already exists, is supported by MS, and is well documented) to start BCP to do your export? Personally, I think the performance of re-dumping the (temp) table to disk differs very little from trying to overwrite a specific line in the file, once you keep in mind the additional overhead of parsing your file to find the line you want to overwrite and THEN overwriting it. Then again, if you have a VLDB and are 'chaining' together a table and a file in this manner, performance cannot be one of your goals.

    /Hans
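    Hans's xp_cmdshell route can be sketched as below, reusing the placeholder credential style from earlier in the thread. The output path is hypothetical:

    ```sql
    -- Shell out to bcp from T-SQL instead of installing a third-party XP.
    -- Server, credentials, and file path are placeholders.
    DECLARE @cmd varchar(1000)
    SET @cmd = 'bcp "SELECT name FROM master..sysobjects ORDER BY name" queryout '
             + 'C:\exports\objects.txt -c -S"<<SERVERNAME>>" '
             + '-U"<<ValidSQLUsername>>" -P"<<PasswordOfSQLUser>>"'
    EXEC master..xp_cmdshell @cmd
    ```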

  • Dear friend

    Let me know one thing: are you trying for client-side cursors or disconnected recordsets? In that case, I would suggest using DB-Library and the ODBC API. But an ADO-like implementation is not child's play, I think.

    With regards

    John

  • I'm not sure if I understand your requirements of a "low tech interface", but if you're using T-SQL, then I'm guessing you're somehow connected to a SQL Server box. So, why not use a DTS package with a text file connection? You can then use transformations, using vbscript or javascript to manipulate your strings. The other application would just need to be able to "call" the package using the command line: dtsrun.exe ... "package name".

    I have several dts packages that read/write to text that I have put into .bat files. That way I can put an icon on the desktop linked to the .bat file, so that others in my company who have no knowledge of SQL can run the process - hence I make it low tech for them.

    Bill.

    OCP
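    A .bat wrapper of the kind Bill describes might look like this. The server, credentials, and package name are placeholders, not from the original post:

    ```bat
    @echo off
    rem Hypothetical desktop-icon wrapper so non-SQL users can run the package.
    rem /S = server, /U = user, /P = password, /N = DTS package name.
    dtsrun /S "<<SERVERNAME>>" /U "<<ValidSQLUsername>>" /P "<<PasswordOfSQLUser>>" /N "<<PackageName>>"
    pause
    ```

    With Windows authentication, /E (trusted connection) can replace the /U and /P switches.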

  • Bill,

    This opens up new possibilities for my solution.

    Thank you, I had not even considered this kind of an approach.
