March 30, 2017 at 10:34 am
Hi there,
I have a table that archives a process.
Here's the table structure:
CREATE TABLE dbo.temp
(
[ID] [int] NULL,
[TypeID] [int] NULL,
[DisplayValue] [varchar](50) NULL,
[Score] [real] NULL,
[CreatedDate] [smalldatetime] NULL
)
CREATE NONCLUSTERED INDEX [ndxID] ON [dbo].[temp]
(
[ID] ASC,
[TypeID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
GO
INSERT INTO dbo.temp VALUES(1,100,'Most likely',0.85,'2017-03-01 00:00:00')
INSERT INTO dbo.temp VALUES(1,200,'No',0,'2017-03-01 00:00:00')
INSERT INTO dbo.temp VALUES(1,300,'Yes',1,'2017-03-01 00:00:00')
TypeID is the ID for DisplayValue, i.e. if TypeID is 100 then DisplayValue is 'Most likely', 200 means 'No', and 300 means 'Yes'. Score is some calculated analytical value.
Each week, the calculated values from dbo.weekly_temp are dumped into dbo.temp. The weekly table has about 400,000,000 records, so over the weeks dbo.temp has grown so large that this statement no longer finishes:
INSERT INTO dbo.temp
SELECT ID,TypeID,DisplayValue,Score,InsertedDate
FROM dbo.weekly_temp
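I've wondered whether breaking the load into batches would help. Something like this is what I had in mind (just a sketch; the 1,000,000-row batch size is a guess on my part, and I haven't tested it on the full table):
DECLARE @MinID int, @MaxID int;
DECLARE @BatchSize int = 1000000; -- guessed batch size, would need tuning

SELECT @MinID = MIN(ID), @MaxID = MAX(ID) FROM dbo.weekly_temp;

WHILE @MinID <= @MaxID
BEGIN
    -- one ID range per statement keeps each transaction (and the log) small
    INSERT INTO dbo.temp (ID, TypeID, DisplayValue, Score, CreatedDate)
    SELECT ID, TypeID, DisplayValue, Score, InsertedDate
    FROM dbo.weekly_temp
    WHERE ID >= @MinID
    AND ID < @MinID + @BatchSize;

    SET @MinID = @MinID + @BatchSize;
END;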
What's the best way to handle this?
Thanks
Resh
April 2, 2017 at 6:27 pm
When you do your weekly dumps, are the IDs always larger than the largest ID in the dbo.Temp table or not?
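If they are, one thing that might be worth testing is converting the heap to a clustered index keyed on ID, so each weekly load appends at the logical end of the table instead of inserting into a heap and separately maintaining the nonclustered index. Just a sketch (the new index name is made up, and building it on a table this size would need a serious maintenance window):
-- sketch only: replace the heap + nonclustered index with a clustered index
-- so ever-increasing IDs append at the logical end of the table
DROP INDEX [ndxID] ON [dbo].[temp];
CREATE CLUSTERED INDEX [ndxCL_ID] -- hypothetical name
ON [dbo].[temp] ([ID] ASC, [TypeID] ASC);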
--Jeff Moden
Change is inevitable... Change for the better is not.