June 2, 2008 at 9:33 am
I need to move data from a large table (350GB/395,000 rows) to another table within the DB, but onto another disk.
My original table is set up like this:
USE [PD51_Data]
GO
/****** Object: Table [dbo].[SCANNEDDOCUMENTS] Script Date: 05/27/2008 11:26:36 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[SCANNEDDOCUMENTS](
[DocID] [int] IDENTITY(1,1) NOT NULL,
[CaseID] [int] NOT NULL,
[DocName] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[Doc] [image] NOT NULL,
[DocLocation] [varchar](255) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[DocNotes] [text] COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[TopicID] [int] NULL,
[ScannedDocumentsCheckSum] [varchar](128) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
PRIMARY KEY CLUSTERED
(
[DocID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
SET ANSI_PADDING OFF
GO
ALTER TABLE [dbo].[SCANNEDDOCUMENTS] WITH NOCHECK ADD CONSTRAINT [ISCANNEDDOCUMENTS2] FOREIGN KEY([TopicID])
REFERENCES [dbo].[TOPICS] ([TopicID])
GO
ALTER TABLE [dbo].[SCANNEDDOCUMENTS] CHECK CONSTRAINT [ISCANNEDDOCUMENTS2]
I created a second table on the SECONDARY FG like this:
USE [PD51_Data]
GO
/****** Object: Table [dbo].[SCANNEDDOCUMENTS2] Script Date: 05/27/2008 11:27:23 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[SCANNEDDOCUMENTS2](
[DocID] [int] IDENTITY(1,1) NOT NULL,
[CaseID] [int] NOT NULL,
[DocName] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[Doc] [image] NOT NULL,
[DocLocation] [varchar](255) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
[DocNotes] [text] COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[TopicID] [int] NULL,
[ScannedDocumentsCheckSum] [varchar](128) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL
) ON [SECONDARY] TEXTIMAGE_ON [SECONDARY]
GO
SET ANSI_PADDING OFF
without the indexes.
My plan was to set the database to bulk-logged recovery, use the Import/Export Wizard to move the data over to the new table, drop the original table, and then create the indexes and constraints on the new one.
With this much data, I'm wondering what else I should do. I have to copy the data this way since the table has an image data type.
Does anyone have a better idea? What am I missing?
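(For illustration, a minimal sketch of the copy step, assuming the existing DocID IDENTITY values need to be preserved in the new table. Note that a plain INSERT…SELECT like this runs as one large, fully logged transaction on SQL Server 2005, which is part of why the wizard's fast-load path, or a staged copy, is usually preferable:)

```sql
USE [PD51_Data]
GO
-- Switch to bulk-logged recovery for the duration of the load
ALTER DATABASE [PD51_Data] SET RECOVERY BULK_LOGGED;
GO
-- Preserve the existing DocID values in the copy
SET IDENTITY_INSERT [dbo].[SCANNEDDOCUMENTS2] ON;

INSERT INTO [dbo].[SCANNEDDOCUMENTS2]
       ([DocID], [CaseID], [DocName], [Doc], [DocLocation],
        [DocNotes], [TopicID], [ScannedDocumentsCheckSum])
SELECT [DocID], [CaseID], [DocName], [Doc], [DocLocation],
       [DocNotes], [TopicID], [ScannedDocumentsCheckSum]
FROM   [dbo].[SCANNEDDOCUMENTS];

SET IDENTITY_INSERT [dbo].[SCANNEDDOCUMENTS2] OFF;
GO
-- Switch back and take a log backup when done
ALTER DATABASE [PD51_Data] SET RECOVERY FULL;
GO
```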
June 2, 2008 at 9:59 am
Try using the Import/Export Wizard or a DTS/SSIS package.
MCP, MCTS (GDBA/EDA)
June 2, 2008 at 10:06 am
Yeah that is my plan, but do you see anything else I need to get done or set for the SSIS package? I have never moved such a large amount of data.
June 2, 2008 at 10:14 am
Are you moving all the data from this table? You could rebuild the clustered index on the new filegroup instead. Not sure if the image data moves, but I would hope so.
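(A sketch of the index-move approach: rebuilding the clustered index with DROP_EXISTING relocates the in-row data to the new filegroup, but the text/image columns stay on whatever filegroup TEXTIMAGE_ON specified, so the Doc and DocNotes data would not move. The index name below is hypothetical; the table's primary key index is system-named and would need to be looked up first:)

```sql
-- Look up the system-generated name of the PK's clustered index first:
-- SELECT name FROM sys.indexes
-- WHERE object_id = OBJECT_ID('dbo.SCANNEDDOCUMENTS') AND index_id = 1;

-- Rebuild it on the SECONDARY filegroup (index name here is hypothetical)
CREATE UNIQUE CLUSTERED INDEX [PK__SCANNEDDOCUMENTS]
ON [dbo].[SCANNEDDOCUMENTS] ([DocID] ASC)
WITH (DROP_EXISTING = ON)
ON [SECONDARY];
```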
The bulk copy method might be best and fastest. Assuming you can prevent or detect changes while in process.
You can also do it in stages, committing after every xx rows to keep the logging down; that would also let you pause or restart the load if that's an issue.
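(A sketch of a staged copy, assuming DocID ranges make a reasonable batching key. Each loop iteration commits as its own transaction, and the NOT EXISTS check makes the load restartable after a pause:)

```sql
DECLARE @batch int, @id int, @maxId int;
SET @batch = 10000;   -- rows per batch; tune to taste
SELECT @id = MIN([DocID]), @maxId = MAX([DocID])
FROM   [dbo].[SCANNEDDOCUMENTS];

SET IDENTITY_INSERT [dbo].[SCANNEDDOCUMENTS2] ON;
WHILE @id <= @maxId
BEGIN
    INSERT INTO [dbo].[SCANNEDDOCUMENTS2]
           ([DocID], [CaseID], [DocName], [Doc], [DocLocation],
            [DocNotes], [TopicID], [ScannedDocumentsCheckSum])
    SELECT s.[DocID], s.[CaseID], s.[DocName], s.[Doc], s.[DocLocation],
           s.[DocNotes], s.[TopicID], s.[ScannedDocumentsCheckSum]
    FROM   [dbo].[SCANNEDDOCUMENTS] s
    WHERE  s.[DocID] >= @id AND s.[DocID] < @id + @batch
      -- skip rows already copied, so the load can be paused and restarted
      AND  NOT EXISTS (SELECT 1 FROM [dbo].[SCANNEDDOCUMENTS2] d
                       WHERE d.[DocID] = s.[DocID]);
    SET @id = @id + @batch;   -- each batch commits on its own
END
SET IDENTITY_INSERT [dbo].[SCANNEDDOCUMENTS2] OFF;
```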
June 2, 2008 at 10:17 am
Steve Jones - Editor (6/2/2008)
Are you moving all the data from this table? You could rebuild the clustered index on the new filegroup instead. Not sure if the image data moves, but I would hope so. The bulk copy method might be best and fastest. Assuming you can prevent or detect changes while in process.
You can also do it in stages, committing after every xx rows to keep the logging down; that would also let you pause or restart the load if that's an issue.
No, it will not move the image data. Yeah, that is another issue I was thinking about, pausing the load. And I was wondering how to move data in stages.
June 2, 2008 at 11:06 am
Take a snapshot of the table and subscribe to another.
MCP, MCTS (GDBA/EDA)
June 2, 2008 at 11:09 am
Manoj (6/2/2008)
Take a snapshot of the table and subscribe to another.
Now I am lost!