November 28, 2005 at 7:49 am
We're in the middle of a years-long migration from FoxPro to SQL 2000. We're finding that the huge amount of business logic written in C# is fine for day-to-day processing, but when converting large client data sets from FoxPro to SQL, the current rate of 25,000 records per hour won't do.
I'm wondering if there are firms out there that could handle writing the T-SQL / stored procedures for such conversions. Since the "status" of a record coming from FoxPro is already known, I believe the conversion logic would be simpler than what the C# must handle for new records from our clients.
I'm told that each FoxPro record (it's not a normalized database) creates 50 records in the highly normalized SQL database. Eventually we will get to converting our larger clients with 1.5 million records, and they need to be converted over a weekend.
We have one of the "new approach" solutions: .NET C# running on web servers, T-SQL generated on the fly by an object, and virtually no use of stored procedures. The database schema was created by a tool (Visio, I think), so we have extreme normalization with very long field names. I wasn't involved in the design of this; I'm just working on the conversion piece now.
SQL 2005 is being considered now but I'm not sure what it has that would make these conversions any faster.
Randy
November 29, 2005 at 8:35 am
Randy,
It is true that one FoxPro record creates a number of normalized SQL records. You may want to do the following: import the FoxPro tables as-is into a staging SQL database (you don't need to create any SQL tables by hand for this step), then write stored procedures or DTS packages to move the data from that staging DB into your normalized SQL DB, paying close attention to the field structures.
You can also use BCP for fast import/export of records, but be careful with memo fields.
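As a rough sketch, once a FoxPro table has been exported to a delimited text file, a bcp command along these lines will load it into a staging table (the server, database, table, and file names below are just placeholders):

bcp Staging.dbo.customer_stage in C:\export\customer.txt -c -t"|" -S YOURSERVER -T

The memo field caution is because the memo text lives in a separate .fpt file, and as far as I recall a plain COPY TO export from FoxPro does not include it, so those columns need to be exported or handled separately.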
We did the upsize from FoxPro to SQL, and it is not an easy or fast thing to do. If you're interested, please contact me via email and I can give you some direction.
Francisco Siu
FS
November 29, 2005 at 8:58 am
It sounds as if it's the business-logic checking that is slowing you down, so linking to the FoxPro tables and using DTS, as Francisco suggested, might be a good solution.
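If you go the linked-server route, the setup is roughly as follows (the server name, path, and table name are placeholders, and this assumes the Visual FoxPro OLE DB provider, VFPOLEDB, is installed on the SQL Server box):

EXEC sp_addlinkedserver
    @server = 'FOXDATA',
    @srvproduct = 'Visual FoxPro',
    @provider = 'VFPOLEDB',
    @datasrc = 'C:\foxdata\mydata.dbc'

SELECT *
FROM OPENQUERY(FOXDATA, 'SELECT * FROM customer')

From there, DTS or plain INSERT ... SELECT statements can pull the FoxPro data across without the row-by-row C# processing.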
A commercial ETL product might be faster and more flexible. Pervasive has a tool that does this: http://www.pervasive.com. There are others too.
November 29, 2005 at 5:21 pm
Since your team comes from a file-based database background, I imagine the C# code is still procedural: you read in a bunch of records, do some processing, and then generate INSERT statements to pump them into SQL.
I had to write scripts to update from v1 of our product to v2; v2 split the data out in a more flexible way, requiring some normalisation (e.g., our system could have multiple files per document in v2).
Get someone familiar with T-SQL to:
a) put the data into temp tables in SQL (use DTS for this), and
b) use simple set-based methods to populate each of your target tables; you'll need some temp tables in the middle to store the keys for any aggregated data or other new keys you generate (rough sketch below).
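To make b) a bit more concrete, here is the rough shape of it in T-SQL. All the table and column names below are invented for illustration, and carrying the old key across in a spare LegacyFoxId column is just one way of getting at the new identity values:

-- 1) Populate the parent table, carrying the old FoxPro key along
--    so the new identity values can be found again afterwards
INSERT INTO dbo.Client (LegacyFoxId, ClientName, ClientStatus)
SELECT s.fox_id, s.name, s.status
FROM dbo.fox_stage s

-- 2) Build a key map from old FoxPro id to new identity value
SELECT c.LegacyFoxId AS fox_id, c.ClientId AS client_id
INTO #client_map
FROM dbo.Client c
JOIN dbo.fox_stage s ON s.fox_id = c.LegacyFoxId

-- 3) Populate the child tables set-based, joining through the map
INSERT INTO dbo.ClientAddress (ClientId, AddressLine1, City)
SELECT m.client_id, s.addr1, s.city
FROM dbo.fox_stage s
JOIN #client_map m ON m.fox_id = s.fox_id

The point is that each statement works on whole sets of rows, so you only scan the staging table a handful of times instead of making a round trip per record.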
It can be a long process, and each situation is usually completely different. I've never used a tool for it, so I'm not sure whether they're helpful or not.