Alternative backup strategy for end users

  • The nature of our business (address deduplication/matching) means we have SQL databases with large numbers of tables created on the fly from a Visual FoxPro application, so our databases can grow very large very quickly.

    We need a method that allows end users to back up and remove specific tables from a database that are no longer required (but with the option of restoring the same tables to the same database).

    At the same time, we want to avoid them having to do anything via Enterprise Manager.

    We have contemplated writing something in our Visual FoxPro front end that would bulk export tables to text files and zip them; a rough sketch of that idea is at the end of this post.

    Wondered if anyone had got any alternative ideas. I appreciate this is a rather strange backup/restore scenario.
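
    A rough sketch of the export-and-zip idea (shown here in Python with pyodbc rather than FoxPro; the connection string, table name, and pipe delimiter are placeholders, not our actual setup):

        import csv
        import zipfile
        import pyodbc

        # Placeholder connection details -- adjust for the real server/database.
        CONN_STR = "DRIVER={SQL Server};SERVER=MYSERVER;DATABASE=AddressWork;Trusted_Connection=yes"

        def export_and_zip(table_name, out_dir="."):
            """Dump one table to a delimited text file, then zip it."""
            txt_path = f"{out_dir}/{table_name}.txt"
            zip_path = f"{out_dir}/{table_name}.zip"

            conn = pyodbc.connect(CONN_STR)
            cursor = conn.cursor()
            cursor.execute(f"SELECT * FROM {table_name}")

            with open(txt_path, "w", newline="") as f:
                writer = csv.writer(f, delimiter="|")
                writer.writerow([col[0] for col in cursor.description])  # header row
                for row in cursor:
                    writer.writerow(row)

            # Compress the text file so the archive stays small on disk.
            with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
                zf.write(txt_path)

            conn.close()
            return zip_path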

  • Depending on the size of the tables, I made this suggestion once for a somewhat similar situation.

    Using ADO, select the recordset from the source (FoxPro), save it in XML format with the Recordset's Save method (adPersistXML), and store the resulting XML document as a text field in your DB. When you need the "table" back, you can retrieve the XML document and reconstitute a recordset from it. Now, instead of managing tables, your users are managing rows in a table. (A rough sketch is at the end of this post.)

    Hokey: yes

    Slow: yes

    Works: yes

    Again, if your tables are large, I would be careful with this.
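
    A rough sketch of that round trip, in Python driving ADO through the win32com package (server, database, and file paths are placeholders; adPersistXML is the ADO constant 1):

        import win32com.client

        CONN_STR = ("Provider=SQLOLEDB;Data Source=MYSERVER;"
                    "Initial Catalog=AddressWork;Integrated Security=SSPI")
        adUseClient, adPersistXML = 3, 1

        def archive_table_as_xml(table_name, xml_path):
            """Persist a whole table as an ADO XML recordset on disk."""
            conn = win32com.client.Dispatch("ADODB.Connection")
            conn.CursorLocation = adUseClient          # client-side cursor so the recordset can be saved
            conn.Open(CONN_STR)
            rs = win32com.client.Dispatch("ADODB.Recordset")
            rs.Open(f"SELECT * FROM {table_name}", conn)
            rs.Save(xml_path, adPersistXML)            # write the recordset out as XML
            rs.Close()
            conn.Close()
            # The XML file's contents could then be loaded into a text column of an
            # "archive" table, so users manage rows instead of whole tables.

        def restore_recordset(xml_path):
            """Reconstitute a disconnected recordset from the persisted XML."""
            rs = win32com.client.Dispatch("ADODB.Recordset")
            rs.Open(xml_path, "Provider=MSPersist;")   # MSPersist reads persisted recordsets
            return rs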

  • I'd go with the bulk export/import idea. This is clean and simple.

    Gregory Larsen, DBA

    If you're looking for SQL Server examples, check out my website at http://www.geocities.com/sqlserverexamples

    Gregory A. Larsen, MVP

  • Thanks for the suggestion.

    The tables we are dealing with could potentially be large (millions of records), and I'm not sure it would be practical to save these SQL tables out to XML.

    Also, we need some strategy for saving a maximum number of records from a table at any one time to keep our backup files to a limited size, i.e. a table of 3 million rows could be backed up in batches of, say, 100,000 records.

    bcp is your best bet then (a batched sketch is below).

    Steve Jones

    sjones@sqlservercentral.com

    http://www.sqlservercentral.com/columnists/sjones
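
    A rough sketch of driving bcp in batches, which would also cover the 100,000-row limit mentioned above (Python again; it assumes a numeric key column, a trusted connection, bcp on the PATH, and placeholder server/database/table names):

        import subprocess
        import pyodbc

        CONN_STR = "DRIVER={SQL Server};SERVER=MYSERVER;DATABASE=AddressWork;Trusted_Connection=yes"
        BATCH_SIZE = 100_000

        def bcp_out_in_batches(db, table, key_col, out_dir="."):
            """Export a large table with bcp queryout, at most BATCH_SIZE key values per file."""
            conn = pyodbc.connect(CONN_STR)
            lo, hi = conn.cursor().execute(
                f"SELECT MIN({key_col}), MAX({key_col}) FROM {table}").fetchone()
            conn.close()

            batch, start = 0, lo
            while start <= hi:
                end = start + BATCH_SIZE
                query = (f"SELECT * FROM {db}..{table} "
                         f"WHERE {key_col} >= {start} AND {key_col} < {end}")
                out_file = f"{out_dir}/{table}_{batch:04d}.txt"
                # -c = character mode, -T = trusted connection, -S = server name
                subprocess.run(
                    ["bcp", query, "queryout", out_file, "-c", "-T", "-S", "MYSERVER"],
                    check=True)
                batch += 1
                start = end

        # Restoring a batch later is just the reverse direction, e.g.:
        #   bcp AddressWork..SomeTable in SomeTable_0000.txt -c -T -S MYSERVER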

  • Or use FoxPro to pull the data and break it into individual files based on file size. By keeping a count on the ADO recordset as you move through it, you will know when to end one file and create the next. Or use a persisted recordset when saving, but this again will be very big. Another popular way would be to DTS the structure and data out to an Access DB file; of course it won't contain everything related to the table, but it offers another option that can also be read by Access without building a front end in FoxPro.
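
    A rough sketch of that row-counting split (shown with a plain Python/pyodbc cursor rather than an ADO recordset; names and the 100,000-row cut-off are placeholders):

        import csv
        import pyodbc

        CONN_STR = "DRIVER={SQL Server};SERVER=MYSERVER;DATABASE=AddressWork;Trusted_Connection=yes"
        ROWS_PER_FILE = 100_000

        def export_in_chunks(table_name, out_dir="."):
            """Walk the result set once, starting a new text file every ROWS_PER_FILE rows."""
            conn = pyodbc.connect(CONN_STR)
            cursor = conn.cursor()
            cursor.execute(f"SELECT * FROM {table_name}")
            header = [col[0] for col in cursor.description]

            file_no, row_count, writer, handle = 0, 0, None, None
            for row in cursor:
                if row_count % ROWS_PER_FILE == 0:      # time to roll over to the next file
                    if handle:
                        handle.close()
                    file_no += 1
                    handle = open(f"{out_dir}/{table_name}_{file_no:03d}.txt", "w", newline="")
                    writer = csv.writer(handle, delimiter="|")
                    writer.writerow(header)
                writer.writerow(row)
                row_count += 1

            if handle:
                handle.close()
            conn.close()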

  • quote:

    "The nature of our business (address deduplication/matching)"

    You don't happen to work for a credit agency or an email spammer, do you?

    I can't think of any other folks who might be dealing with lots of addresses. Is this a data mining app? (Subscriptions maybe?) Credit cards? You've got me curious.

  • No, not email spamming.

    We do data warehousing and mailing address processing (deduplication, cleaning, suppressions, etc.) for our clients' marketing requirements.
