March 16, 2009 at 8:18 am
Hello,
I have created a program that reads data from a flat file (delivered by a supplier).
This file has to be processed roughly every month.
Because there is a lot of manipulation to perform on the data in the file, I decided to write a VB.NET program that works through this file, gets the data I need row by row, performs some manipulations on that row,
and then executes a stored procedure.
The record count of this file is around 400,000.
My Task Manager shows no heavy load (4% avg) on my laptop and stable memory consumption for the VB.NET program.
What I do notice, however, is that the first 5,000 to 10,000 rows were inserted much faster than the ones that followed.
Stats are:
start with an empty table: avg 6,000 records/minute
when 100,000 rows had been inserted: 1,000 inserts/minute
when approx. 200,000 records had been inserted: it fell back further to about 600/minute
I have disabled all nonclustered indexes on the table for the load.
I also tried to disable the clustered index, but at that point the stored procedure could not find any plan, so inserting failed.
My guess is that the more records there are in the table, the slower the index gets.
Is there any way to disable the index and still get the inserts to work?
Or should I disable the foreign key references?
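For reference, this is roughly what I mean by disabling the nonclustered indexes around the load (the index name here is only a placeholder, the real nonclustered indexes are not in the DDL below; the clustered PK has to stay enabled or the table becomes inaccessible):
-- disable a nonclustered index before the load
ALTER INDEX [IX_ARTIKELMAGAZIJN_Placeholder] ON [dbo].[ARTIKELMAGAZIJN] DISABLE
GO
-- ... run the load ...
-- rebuild (re-enable) it afterwards
ALTER INDEX [IX_ARTIKELMAGAZIJN_Placeholder] ON [dbo].[ARTIKELMAGAZIJN] REBUILD
GO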
Table structure :
USE [develop]
GO
/****** Object: Table [dbo].[ARTIKELMAGAZIJN] Script Date: 03/16/2009 15:13:03 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[ARTIKELMAGAZIJN](
[ARM_ID] [int] IDENTITY(1,1) NOT NULL,
[FAM_ID] [int] NULL,
[KKM_ID] [int] NULL,
[ARM_REMOVED] [bit] NOT NULL CONSTRAINT [DF_ARTIKELDD_ARD_REMOVED] DEFAULT ((0)),
[ARM_ARTIKEL_NR_DD] [varchar](25) COLLATE Latin1_General_BIN NOT NULL,
[ARM_ARTIKEL_OMS_DD] [varchar](80) COLLATE Latin1_General_BIN NULL,
[ARM_STOCKCONTROLE] [bit] NOT NULL,
[ARM_VERPLICHT_MIN_IN_STOCK] [int] NOT NULL,
[ARM_VERPLICHT_MAX_STOCK] [int] NOT NULL,
[ARM_AUTOMATISCH_BESTELLEN] [bit] NOT NULL CONSTRAINT [DF_ARTIKELMAGAZIJN_ARM_AUTOMATISCH_BESTELLEN] DEFAULT ((0)),
[ARM_AANTAL_IN_STOCK] [int] NOT NULL CONSTRAINT [DF_ARTIKELMAGAZIJN_ARM_AANTAL_IN_STOCK] DEFAULT ((0)),
[ARM_AANTAL_IN_BESTELLING] [int] NOT NULL CONSTRAINT [DF_ARTIKELMAGAZIJN_ARM_AANTAL_IN_BESTELLING] DEFAULT ((0)),
[ARM_BARCODE] [int] NULL,
[ARM_LASTUPDATE] [datetime] NULL CONSTRAINT [DF_ARTIKELMAGAZIJN_ARM_LASTUPDATE] DEFAULT (getdate()),
[ARM_ACTIEF] [bit] NOT NULL CONSTRAINT [DF_ARTIKELMAGAZIJN_ARM_ACTIEF] DEFAULT ((1)),
CONSTRAINT [PKCU-ARTIKELMAGAZIJN-ARM_ID] PRIMARY KEY CLUSTERED
(
[ARM_ID] ASC
)WITH (IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]
GO
SET ANSI_PADDING OFF
GO
USE [develop]
GO
ALTER TABLE [dbo].[ARTIKELMAGAZIJN] WITH NOCHECK ADD CONSTRAINT [FK_ARTIKELMAGAZIJN_FAMILIE_OP_FAM_ID] FOREIGN KEY([FAM_ID])
REFERENCES [dbo].[FAMILIE] ([FAM_ID])
GO
ALTER TABLE [dbo].[ARTIKELMAGAZIJN] CHECK CONSTRAINT [FK_ARTIKELMAGAZIJN_FAMILIE_OP_FAM_ID]
GO
ALTER TABLE [dbo].[ARTIKELMAGAZIJN] WITH NOCHECK ADD CONSTRAINT [FK_ARTIKELMAGAZIJN_KORTINGMAGAZIJN_OP_KKM_ID] FOREIGN KEY([KKM_ID])
REFERENCES [dbo].[KORTINGMAGAZIJN] ([KKM_ID])
GO
ALTER TABLE [dbo].[ARTIKELMAGAZIJN] CHECK CONSTRAINT [FK_ARTIKELMAGAZIJN_KORTINGMAGAZIJN_OP_KKM_ID]
The Stored procedure:
ALTER PROCEDURE [dbo].[sproc_tmpImportMercedesArtikelen] (
@artikelNr varchar(25) = '',
@artikelOmschrDD varchar(80) = '',
@stockControle BIT = 0,
@actief BIT = 1,
@minInStock INTEGER = 0,
@maxInStock INTEGER = 0,
@aantalInBestelling INTEGER = 0,
@Klasse VARCHAR(2) ='N',
@barcode INTEGER ,
@autoBestellen BIT = 0,
@msgstat VARCHAR(255) output
)
AS
BEGIN
DECLARE @KKM_ID INTEGER,
@FAM_Id INTEGER,
@ARM_ID INTEGER
SELECT @KKM_ID = KKM_ID FROM [KORTINGMAGAZIJN] AS k WHERE k.[KKM_KLASSE] = @Klasse
--SELECT @FAM_Id = FAM_ID FROM [FAMILIE] AS f WHERE [FAM_CODE] ='MB'
SET NOCOUNT ON;
BEGIN TRY
SELECT @ARM_ID = [ARM_ID] FROM [ARTIKELMAGAZIJN] AS a WHERE a.[ARM_ARTIKEL_NR_DD] = @artikelNr
IF @ARM_ID > 0
BEGIN
-- update
UPDATE [ARTIKELMAGAZIJN]
SET [ARM_ARTIKEL_OMS_DD] = @artikelOmschrDD,
[ARM_STOCKCONTROLE] = @stockControle,
[ARM_ACTIEF] = @actief,
[ARM_VERPLICHT_MIN_IN_STOCK] = @minInStock,
[ARM_VERPLICHT_MAX_STOCK] = @maxInStock,
[ARM_AANTAL_IN_BESTELLING] = @aantalInBestelling,
[FAM_ID] = 3, --@FAM_Id
[ARM_BARCODE] = @barcode,
[ARM_AUTOMATISCH_BESTELLEN] = @autoBestellen,
[ARM_LASTUPDATE] = GETDATE()
WHERE [ARM_ID] = @ARM_ID
SELECT @msgstat = 'Alles OK - Update ArtikelMagazijn'
END
ELSE
BEGIN
INSERT INTO [ARTIKELMAGAZIJN] (
[FAM_ID],
[KKM_ID],
[ARM_REMOVED],
[ARM_ARTIKEL_NR_DD],
[ARM_ARTIKEL_OMS_DD],
[ARM_STOCKCONTROLE],
[ARM_VERPLICHT_MIN_IN_STOCK],
[ARM_VERPLICHT_MAX_STOCK],
[ARM_AUTOMATISCH_BESTELLEN],
[ARM_AANTAL_IN_STOCK],
[ARM_AANTAL_IN_BESTELLING],
[ARM_BARCODE],
[ARM_LASTUPDATE],
[ARM_ACTIEF]
) VALUES (
/* FAM_ID - int @FAM_Id*/ 3,
/* KKM_ID - int */ @KKM_ID,
/* ARM_REMOVED - bit */ 0,
/* ARM_ARTIKEL_NR_DD - varchar(25) */ @artikelNr,
/* ARM_ARTIKEL_OMS_DD - varchar(80) */ @artikelOmschrDD,
/* ARM_STOCKCONTROLE - bit */ @stockControle,
/* ARM_VERPLICHT_MIN_IN_STOCK - int */ @minInStock,
/* ARM_VERPLICHT_MAX_STOCK - int */ @maxInStock,
/* ARM_AUTOMATISCH_BESTELLEN - bit */ @autoBestellen,
/* ARM_AANTAL_IN_STOCK - int */ @autoBestellen,
/* ARM_AANTAL_IN_BESTELLING - int */ @aantalInBestelling,
/* ARM_BARCODE - int */ @barcode,
/* ARM_LASTUPDATE - datetime */ GETDATE(),
/* ARM_ACTIEF - bit */ @actief )
SET @ARM_ID = scope_identity()
SELECT @msgstat = 'Alles OK - Insert ArtikelMagazijn: nieuw ID = ' + CONVERT(VARCHAR(10),@ARM_ID)
END
END TRY
BEGIN CATCH
SELECT @msgstat = ERROR_MESSAGE()
END CATCH
END
Tnx for looking into this problem.
Wkr,
Eddy
March 16, 2009 at 2:44 pm
As far as the SP is concerned, all seems good. Maybe you can tweak it a bit to improve performance.
1) You could store GETDATE() in a variable and use that variable in the query.
Also, figure out whether the delay is on the SQL Server side or in the .NET app. To test this, you could dump the massaged data to a staging table, call the same procedure from the app, and check the performance.
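For example, a minimal sketch of the GETDATE() suggestion (the @ARM_ID value is just a placeholder; in the proc it comes from the lookup on ARM_ARTIKEL_NR_DD):
DECLARE @now DATETIME, @ARM_ID INT
-- evaluate GETDATE() once per call and reuse the variable
SET @now = GETDATE()
SET @ARM_ID = 1 -- placeholder
UPDATE [ARTIKELMAGAZIJN]
SET [ARM_LASTUPDATE] = @now
WHERE [ARM_ID] = @ARM_ID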
March 18, 2009 at 1:17 pm
Hi Sanjay,
Thank you for your response.
I have done several tests, and it seems that it was my .NET console application that was slowing down,
even though there was no memory leak, nor any increase in CPU or even in I/O reads/writes.
In other words, I don't know why it slows down.
My best guess is that it's due to the logging to the console window.
It is a lot of text to display, 2 lines per processed record.
On the SQL Server side, I noticed one (of 4) CPUs running at 100%, but the server was responding very quickly to any other request; no slowdowns there.
Wkr,
Eddy
March 18, 2009 at 1:38 pm
Have you looked at network congestion? You're sending a lot of data across the wire from your laptop, where your app is running, to the server. Where is your text file stored, your machine or the server?
Depending on your data and how many changes need to be made to it, you might think about using one of the bulk insert techniques specifically created for importing large chunks of data. SSIS seems like a logical choice, as it would allow you to do just about anything you are currently doing in your VB.Net app. Also, you could consider importing to a staging table with BCP and then doing your data conversion from there.
The biggest factor, however, is that you are processing 400,000 inserts in a row-by-row approach. You will see much better performance by doing this in a set-based approach.
I'm about ready to run out the door at the moment, but hopefully later tonight I'll get to take a closer look at your code and see if I can't come up with a way to do the insert all in one shot. You might want to think along those lines as well.
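As a rough illustration of the staging idea only (the staging table name and file path are placeholders; the file would need to be reachable from the server and have one fixed-width record per line):
-- load the raw supplier file into a staging table in one shot,
-- then do the clean-up and the real inserts set-based from there
CREATE TABLE dbo.STAGING_MERCEDES
(
val varchar(135) NOT NULL
)
BULK INSERT dbo.STAGING_MERCEDES
FROM 'C:\test\supplierfile.txt'
WITH (ROWTERMINATOR = '\n')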
-Luke.
March 19, 2009 at 4:12 am
Hi Luke,
Network congestion is fine; the server is running on dual gigabit,
and my laptop is connected on a 1 Gbps link.
The text file is stored locally on my laptop, so indeed I have to send over a lot of data.
I have also been thinking about a bulk insert approach, and that works:
410,000 records uploaded in 14 seconds. That's enormously fast; I was shocked by the performance that gives. But
due to the conversions that need to be made and my 'novice' skills in T-SQL programming, I chose the row-by-row approach via a .NET console.
That way I have control of the flow.
I also know that SQL Server itself prefers set-based solutions, because it mainly works with data sets, and I have also been thinking about SSIS, but that too is new for us.
I have been busy with SQL Server for a little under a year now, and still have much to learn :hehe:
I would really appreciate an example of manipulating the data in a set-based manner.
Or if someone could provide an example of an SSIS package in which they handle foreign key lookups and such, something that I could use as a starting point for loads like these, that would be awesome.
Wkr,
Eddy
March 19, 2009 at 6:38 am
Sorry, I'd planned to look into this last night, and just lost track of time... You say your issue is with FK lookups and such; any chance you can get me some DDL for the additional tables and also some sample data to play with? Check out this link for how to properly post the DDL and sample data.
When I was talking about network issues, it wasn't so much congestion I was thinking about, but that instead of doing 1 insert, with one request and one answer, you are doing 400,000 individual inserts with 400,000 requests and 400,000 responses. That all adds up to being much more costly from a network, CPU and IO standpoint than a single large insert. Think of it this way: for each row that succeeds, you return a response that's 46 characters plus a varchar(10). 56*400,000/1024/1024 = 21.4 MB of just success messages.
Add to that the fact that you may be crushing your transaction log, because instead of logging one bulk operation it's logging each and every insert, so it may come to a point where it needs to autogrow, and this could be one of the causes behind your slowdown as the job progresses.
I understand you want to use tools familiar to you and so you chose a VB.Net client application, but I would strongly suggest you look into other means. I'm not terribly strong with SSIS, as I'm mostly a straight TSQL kind of guy, and I know how I would handle this in SQL 2000, but I'm not 100% sure for 2005, particularly since you didn't provide any sample data. What I might do to accomplish this is the bulk insert to a staging table, since it only takes 14 seconds, and then do the insert into your production tables, but someone else may prefer to do it all in SSIS. Will you be the one doing this import each month, or will this be tasked to some other non-admin users? Just something else to think about, as it might change the way we approach it.
It looks like you could just join directly to your FK tables and do a fairly straightforward insert. Hit me with some sample data and DDL for your associated tables, and I'll see what I can do.
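Something along these lines is what I have in mind (untested; it assumes the raw file has already been bulk loaded and split into columns in a staging table, and the staging column names are guesses based on your proc's parameters):
-- update the rows that already exist, doing the FK lookup with a join
UPDATE a
SET a.ARM_ARTIKEL_OMS_DD = s.ArtikelOmschrijving,
a.KKM_ID = k.KKM_ID,
a.FAM_ID = 3,
a.ARM_LASTUPDATE = GETDATE()
FROM dbo.ARTIKELMAGAZIJN AS a
JOIN dbo.STAGING_MERCEDES AS s ON s.ArtikelNr = a.ARM_ARTIKEL_NR_DD
LEFT JOIN dbo.KORTINGMAGAZIJN AS k ON k.KKM_KLASSE = s.Klasse

-- insert the rows that don't exist yet, all in one statement
INSERT INTO dbo.ARTIKELMAGAZIJN
(FAM_ID, KKM_ID, ARM_REMOVED, ARM_ARTIKEL_NR_DD, ARM_ARTIKEL_OMS_DD,
ARM_STOCKCONTROLE, ARM_VERPLICHT_MIN_IN_STOCK, ARM_VERPLICHT_MAX_STOCK,
ARM_AUTOMATISCH_BESTELLEN, ARM_AANTAL_IN_STOCK, ARM_AANTAL_IN_BESTELLING,
ARM_BARCODE, ARM_LASTUPDATE, ARM_ACTIEF)
SELECT 3, k.KKM_ID, 0, s.ArtikelNr, s.ArtikelOmschrijving,
0, 0, 0, 0, 0, 0, NULL, GETDATE(), 1
FROM dbo.STAGING_MERCEDES AS s
LEFT JOIN dbo.KORTINGMAGAZIJN AS k ON k.KKM_KLASSE = s.Klasse
WHERE NOT EXISTS (SELECT 1 FROM dbo.ARTIKELMAGAZIJN AS a
WHERE a.ARM_ARTIKEL_NR_DD = s.ArtikelNr)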
-Luke.
March 23, 2009 at 6:18 am
Hi Luke,
Sorry for the delay. I went home on Thursday afternoon because I was getting sick, I was stuck with the flu... but now I'm back, not for the full 100%, but back is back, don't you agree 😉
Then maybe it's more appropriate that I send you this example of something else that also needs to be read in every month. It's for the same supplier and has more or less the same amount of data, but for me it's a bit complicated to do in one bulk update/insert,
especially because there is price calculation involved.
I have not yet transformed this into a stored procedure, but the idea is that once the script is finished it should run as a sproc.
For the moment it runs a bit slow.
Inserting into the temp table takes about 20-30 seconds; the total run is about 3-4 hours.
The DDL for all the tables needed:
USE [develop]
GO
/****** Object: Table [dbo].[ARTIKELMAGAZIJN] Script Date: 03/23/2009 12:28:02 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[ARTIKELMAGAZIJN](
[ARM_ID] [int] IDENTITY(1,1) NOT NULL,
[FAM_ID] [int] NULL,
[KKM_ID] [int] NULL,
[ARM_REMOVED] [bit] NOT NULL CONSTRAINT [DF_ARTIKELDD_ARD_REMOVED] DEFAULT ((0)),
[ARM_ARTIKEL_NR_DD] [varchar](25) COLLATE Latin1_General_BIN NOT NULL,
[ARM_ARTIKEL_OMS_DD] [varchar](80) COLLATE Latin1_General_BIN NULL,
[ARM_STOCKCONTROLE] [bit] NOT NULL,
[ARM_VERPLICHT_MIN_IN_STOCK] [int] NOT NULL,
[ARM_VERPLICHT_MAX_STOCK] [int] NOT NULL,
[ARM_AUTOMATISCH_BESTELLEN] [bit] NOT NULL CONSTRAINT [DF_ARTIKELMAGAZIJN_ARM_AUTOMATISCH_BESTELLEN] DEFAULT ((0)),
[ARM_AANTAL_IN_STOCK] [int] NOT NULL CONSTRAINT [DF_ARTIKELMAGAZIJN_ARM_AANTAL_IN_STOCK] DEFAULT ((0)),
[ARM_AANTAL_IN_BESTELLING] [int] NOT NULL CONSTRAINT [DF_ARTIKELMAGAZIJN_ARM_AANTAL_IN_BESTELLING] DEFAULT ((0)),
[ARM_BARCODE] [int] NULL,
[ARM_LASTUPDATE] [datetime] NULL CONSTRAINT [DF_ARTIKELMAGAZIJN_ARM_LASTUPDATE] DEFAULT (GetDate()),
[ARM_ACTIEF] [bit] NOT NULL CONSTRAINT [DF_ARTIKELMAGAZIJN_ARM_ACTIEF] DEFAULT ((1)),
CONSTRAINT [PKCU-ARTIKELMAGAZIJN-ARM_ID] PRIMARY KEY CLUSTERED
(
[ARM_ID] ASC
)WITH (IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]
GO
CREATE TABLE [dbo].[KORTINGMAGAZIJN](
[KKM_ID] [int] IDENTITY(1,1) NOT NULL,
[KKM_KLASSE] [varchar](2) COLLATE Latin1_General_BIN NULL,
[KKM_PERCENT] [decimal](14, 4) NULL,
[KKM_EXTRA] [decimal](14, 4) NULL,
[KKM_TRIMESTER] [decimal](14, 4) NULL,
[KKM_ADR_ID_LEV] [int] NOT NULL,
[KKM_REMOVED] [bit] NOT NULL CONSTRAINT [DF_KORTINGMAGAZIJN_KKM_REMOVED] DEFAULT ((0)),
CONSTRAINT [PKCU-KORTINGMAGAZIJN-KKM_ID] PRIMARY KEY CLUSTERED
(
[KKM_ID] ASC
)WITH (IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]
GO
CREATE TABLE [dbo].[ARTIKELLEVERANCIERMAGAZIJN](
[ALM_ID] [int] IDENTITY(1,1) NOT NULL,
[ADS_ID] [int] NOT NULL,
[ARM_ID] [int] NOT NULL,
[ALM_ARTIKEL_LEVERANCIER] [varchar](80) COLLATE Latin1_General_BIN NOT NULL,
[ALM_OPMERKING] [varchar](255) COLLATE Latin1_General_BIN NULL,
[ALM_EENHEID] [decimal](14, 4) NOT NULL,
[ALM_VERPAKKING_HOEVEELHEID] [int] NULL,
[ALM_MIN_BESTEL_HOEVEELHEID] [int] NULL,
[ALM_MUNTCODE] [varchar](3) COLLATE Latin1_General_BIN NOT NULL,
[ALM_BRUTO_PRIJS_PER_EENHEID] [decimal](14, 4) NULL,
[ALM_BRUTO_PRIJS_PER_VERPAKKING] [decimal](14, 4) NULL,
[ALM_KORTING_1] [decimal](14, 4) NULL,
[ALM_NETTO_PRIJS_1_PER_EENHEID] [decimal](14, 4) NULL,
[ALM_NETTO_PRIJS_1_PER_VERPAKKING] [decimal](14, 4) NULL,
[ALM_KORTING_2] [decimal](14, 4) NULL,
[ALM_NETTO_PRIJS_2_PER_EENHEID] [decimal](14, 4) NULL,
[ALM_NETTOPRIJS_2_PER_VERPAKKING] [decimal](14, 4) NULL,
[ALM_REMOVED] [bit] NOT NULL CONSTRAINT [DF_ARTIKELLEVERANCIERMAGAZIJN_ALM_REMOVED] DEFAULT ((0)),
[ALM_EENHEIDCODE] [int] NOT NULL,
CONSTRAINT [PKCU-ARTIKELLEVERANCIERMAGAZIJN-ALM_ID] PRIMARY KEY CLUSTERED
(
[ALM_ID] ASC
)WITH (IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]
GO
ALTER TABLE [dbo].[ARTIKELMAGAZIJN] WITH NOCHECK ADD CONSTRAINT [FK_ARTIKELMAGAZIJN_FAMILIE_OP_FAM_ID] FOREIGN KEY([FAM_ID])
REFERENCES [dbo].[FAMILIE] ([FAM_ID])
GO
ALTER TABLE [dbo].[ARTIKELMAGAZIJN] CHECK CONSTRAINT [FK_ARTIKELMAGAZIJN_FAMILIE_OP_FAM_ID]
GO
ALTER TABLE [dbo].[ARTIKELMAGAZIJN] WITH NOCHECK ADD CONSTRAINT [FK_ARTIKELMAGAZIJN_KORTINGMAGAZIJN_OP_KKM_ID] FOREIGN KEY([KKM_ID])
REFERENCES [dbo].[KORTINGMAGAZIJN] ([KKM_ID])
GO
ALTER TABLE [dbo].[ARTIKELMAGAZIJN] CHECK CONSTRAINT [FK_ARTIKELMAGAZIJN_KORTINGMAGAZIJN_OP_KKM_ID]
GO
ALTER TABLE [dbo].[ARTIKELLEVERANCIERMAGAZIJN] WITH CHECK ADD CONSTRAINT [FK_ARTIKELLEVERANCIERMAGAZIJN_ARTIKELMAGAZIJN_OP_ARM_ID] FOREIGN KEY([ARM_ID])
REFERENCES [dbo].[ARTIKELMAGAZIJN] ([ARM_ID])
Inserting sample data into KORTINGMAGAZIJN, which holds the percentage values to take into account:
INSERT INTO [KORTINGMAGAZIJN] (
[KKM_KLASSE],
[KKM_PERCENT],
[KKM_EXTRA],
[KKM_TRIMESTER],
[KKM_ADR_ID_LEV],
[KKM_REMOVED]
) SELECT 'A1','18.0000','0.0000','0.0000','21320','0' UNION ALL
SELECT 'A2','15.0000','0.0000','0.0000','21320','0' UNION ALL
SELECT 'B1','41.0000','0.0000','5.7500','21320','0' UNION ALL
SELECT 'B2','42.0000','0.0000','0.0000','21320','0' UNION ALL
SELECT 'B3','34.0000','0.0000','0.0000','21320','0' UNION ALL
SELECT 'D1','26.0000','4.0000','9.0000','21320','0' UNION ALL
SELECT 'D2','29.0000','16.0000','9.0000','21320','0' UNION ALL
SELECT 'D3','45.0000','0.0000','9.0000','21320','0' UNION ALL
SELECT 'D4','27.0000','0.0000','9.0000','21320','0' UNION ALL
SELECT 'F1','40.0000','0.0000','12.7500','21320','0' UNION ALL
SELECT 'F2','44.0000','0.0000','12.7500','21320','0' UNION ALL
SELECT 'F3','48.0000','0.0000','12.7500','21320','0' UNION ALL
SELECT 'F4','52.0000','0.0000','12.7500','21320','0' UNION ALL
SELECT 'K0','38.0000','12.0000','9.0000','21320','0' UNION ALL
SELECT 'K1','43.0000','12.0000','9.0000','21320','0' UNION ALL
SELECT 'K2','48.0000','7.0000','9.0000','21320','0' UNION ALL
SELECT 'K3','40.0000','10.0000','9.0000','21320','0' UNION ALL
SELECT 'L1','25.0000','0.0000','0.0000','21320','0' UNION ALL
SELECT 'M','14.0000','0.0000','9.0000','21320','0' UNION ALL
SELECT 'M1','25.0000','0.0000','5.7500','21320','0' UNION ALL
SELECT 'M2','30.0000','0.0000','0.0000','21320','0' UNION ALL
SELECT 'M3','34.0000','0.0000','0.0000','21320','0' UNION ALL
SELECT 'N','0.0000','0.0000','0.0000','21320','0' UNION ALL
SELECT 'S1','12.0000','0.0000','0.0000','21320','0'
The script I use to read out the text file and perform the operations:
sp_configure 'show advanced options', 1
GO
RECONFIGURE
GO
sp_configure 'Ad Hoc Distributed Queries', 1
GO
RECONFIGURE
GO
USE [develop]
GO
PRINT 'start'
PRINT GETDATE()
if object_id('tempdb..#T') is not null
drop table #T
Create table #T
(
val varchar(135),
ID int IDENTITY(1, 1) NOT NULL
)
insert into #T
select *
from openrowset('MSDASQL',
'Driver={Microsoft Text Driver (*.txt; *.csv)};
DEFAULTDIR=C:\test; Extensions=TXT;', 'Select * from SXTDNE_MB.txt')
declare @teller int,
@fieldc INTEGER,
@vwaarde varchar(max),
@datum DATETIME,
@familieID INT,
@KKM_ID INT,
@artnr VARCHAR(19),
@tempwaarde VARCHAR(19),
@k INT,
@j INT,
@l INT,
@BrutoEuro DECIMAL(14,4),
@ARM_ID INT,
@ALM_ID INT,
@ADS_ID INT
Set @teller = 1 ;
select @fieldc = MAX([ID]) FROM #T
SELECT @ADS_ID = 25859, @familieID = 3
While @teller <= @fieldc
BEGIN
SELECT @k = 2,
@j = 0,
@l = 0
select @vwaarde = val from #T where ID = @teller
SELECT @tempwaarde = substring(@vwaarde, 1, 19)
-- generate the Mercedes part number --
SELECT @artnr = 'MB '
WHILE @l <= 19
BEGIN
IF @j < 2
BEGIN
SELECT @artnr = @artnr + SUBSTRING(@tempwaarde, @k, 3)
-- advance the position and counters (3 characters consumed)
SELECT @l = @l + 3,
@k = @k + 3,
@j = @j + 1
IF @j = 1
SELECT @artnr = @artnr + '.'
END
ELSE
IF SUBSTRING(@tempwaarde, @k, 1) <> ''
BEGIN
SELECT @artnr = @artnr + '.'
+ SUBSTRING(@tempwaarde, @k, 2)
-- advance the position and counter (2 characters consumed)
SELECT @l = @l + 2,
@k = @k + 2
END
ELSE
SELECT @l = 20
END
--PRINT @artnr
-- end of Mercedes part number generation --
BEGIN TRY
SELECT @BrutoEuro = (CAST(substring(@vwaarde, 114, 10) AS DECIMAL(14,4)) / 100)
SELECT @KKM_ID = KKM_ID FROM dbo.KORTINGMAGAZIJN WHERE KKM_KLASSE = substring(@vwaarde, 124, 2)
SELECT @ARM_ID = ARM_ID FROM [ARTIKELMAGAZIJN] WHERE [ARM_ARTIKEL_NR_DD] = @artnr
IF ISNULL(@ARM_ID,0) > 0
UPDATE develop.dbo.[ARTIKELMAGAZIJN]
SET KKM_ID = @KKM_ID,
ARM_LASTUPDATE = GETDATE()
WHERE [ARM_ID] = @ARM_ID
ELSE
BEGIN
INSERT INTO develop.dbo.[ARTIKELMAGAZIJN]
(
[FAM_ID],
[KKM_ID],
[ARM_REMOVED],
[ARM_ARTIKEL_NR_DD],
[ARM_ARTIKEL_OMS_DD],
[ARM_STOCKCONTROLE],
[ARM_VERPLICHT_MIN_IN_STOCK],
[ARM_VERPLICHT_MAX_STOCK],
[ARM_AUTOMATISCH_BESTELLEN],
[ARM_AANTAL_IN_STOCK],
[ARM_AANTAL_IN_BESTELLING],
[ARM_BARCODE],
[ARM_LASTUPDATE],
[ARM_ACTIEF]
)
VALUES (
@familieID,
@KKM_ID,
0,
@artnr,
substring(@vwaarde, 64, 25),
0,
0,
0,
0,
0,
0,
0,
GETDATE(),
1
)
SELECT @ARM_ID =SCOPE_IDENTITY()
END
-- PRINT CONVERT(VARCHAR(12),@ARM_ID)
-- determine the prices
DECLARE @BrutoprijsPerVerpakking DECIMAL(14,4),
@Nettotprijs1perEenheid DECIMAL(14,4),
@Nettotprijs1perverpakking DECIMAL(14,4),
@Nettotprijs2perEenheid DECIMAL(14,4),
@Nettotprijs2perverpakking DECIMAL(14,4),
@korting1 DECIMAL(14,4),
@korting2 DECIMAL(14,4),
@verpakkingsEenheid INT
SELECT @ALM_ID = ALM_ID, @verpakkingsEenheid = [ALM_VERPAKKING_HOEVEELHEID] FROM develop.dbo.[ARTIKELLEVERANCIERMAGAZIJN] AS a WHERE [ARM_ID] = @ARM_ID AND [ADS_ID] = @ADS_ID
SELECT @BrutoprijsPerVerpakking = @BrutoEuro
IF ISNULL(@KKM_ID,0) > 0
SELECT @korting1 = KKM_PERCENT, @korting2= KKM_EXTRA FROM [KORTINGMAGAZIJN] WHERE KKM_ID = @KKM_ID
--PRINT 'KKMID=: ' + CONVERT(VARCHAR(12),@KKM_ID) + 'ALM_ID=: ' + CONVERT(VARCHAR(12),@ALM_ID)
SELECT @Nettotprijs1perverpakking = @BrutoprijsPerVerpakking - (@BrutoprijsPerVerpakking * (ISNULL(@korting1,0) / 100))
SELECT @Nettotprijs1perEenheid = (@Nettotprijs1perverpakking / ISNULL(@verpakkingsEenheid,1))
-- net price per packaging 2 = gross minus the 1st discount, with the 2nd discount applied on top
IF ISNULL(@korting2,0) > 0
BEGIN
SELECT @Nettotprijs2perverpakking = @Nettotprijs1perverpakking - (@Nettotprijs1perverpakking *(@korting2 / 100))
SELECT @Nettotprijs2perEenheid = ( @Nettotprijs2perverpakking / ISNULL(@verpakkingsEenheid,1) )
END
ELSE
SELECT @Nettotprijs2perEenheid = 0, @Nettotprijs2perverpakking = 0
-- end of prices
-- insert or update ARTIKELLEVERANCIERMAGAZIJN
IF ISNULL(@ALM_ID,0) > 0
-- update
UPDATE [ARTIKELLEVERANCIERMAGAZIJN]
SET [ALM_ARTIKEL_LEVERANCIER]= @artnr,
[ALM_OPMERKING]= substring(@vwaarde, 64, 25),
[ALM_BRUTO_PRIJS_PER_VERPAKKING] = @BrutoprijsPerVerpakking,
[ALM_KORTING_1] = @korting1,
[ALM_NETTO_PRIJS_1_PER_EENHEID] = @Nettotprijs1perEenheid,
[ALM_NETTO_PRIJS_1_PER_VERPAKKING] = @Nettotprijs1perverpakking,
[ALM_KORTING_2] = @korting2,
[ALM_NETTO_PRIJS_2_PER_EENHEID] = @Nettotprijs2perEenheid,
[ALM_NETTOPRIJS_2_PER_VERPAKKING] = @Nettotprijs2perverpakking
WHERE [ALM_ID] = @ALM_ID
ELSE
BEGIN
-- insert
INSERT INTO [ARTIKELLEVERANCIERMAGAZIJN] (
[ADS_ID],
[ARM_ID],
[ALM_ARTIKEL_LEVERANCIER],
[ALM_OPMERKING],
[ALM_EENHEID],
[ALM_VERPAKKING_HOEVEELHEID],
[ALM_MIN_BESTEL_HOEVEELHEID],
[ALM_MUNTCODE],
[ALM_BRUTO_PRIJS_PER_EENHEID],
[ALM_BRUTO_PRIJS_PER_VERPAKKING],
[ALM_KORTING_1],
[ALM_NETTO_PRIJS_1_PER_EENHEID],
[ALM_NETTO_PRIJS_1_PER_VERPAKKING],
[ALM_KORTING_2],
[ALM_NETTO_PRIJS_2_PER_EENHEID],
[ALM_NETTOPRIJS_2_PER_VERPAKKING],
[ALM_REMOVED]
) VALUES (
/* ADS_ID - int */ @ADS_ID,
/* ARM_ID - int */ @ARM_ID ,
/* ALM_ARTIKEL_LEVERANCIER - varchar(80) */ @artnr,
/* ALM_OPMERKING - varchar(255) */ substring(@vwaarde, 64, 25),
/* ALM_EENHEID - int */ 1,
/* ALM_VERPAKKING_HOEVEELHEID - int */ 1,
/* ALM_MIN_BESTEL_HOEVEELHEID - int */ 1,
/* ALM_MUNTCODE - varchar(3) */ 'EUR',
/* ALM_BRUTO_PRIJS_PER_EENHEID - decimal(14, 4) */ @BrutoEuro,
/* ALM_BRUTO_PRIJS_PER_VERPAKKING - decimal(14, 4) */ @BrutoprijsPerVerpakking,
/* ALM_KORTING_1 - decimal(14, 4) */ @korting1,
/* ALM_NETTO_PRIJS_1_PER_EENHEID - decimal(14, 4) */ @Nettotprijs1perEenheid,
/* ALM_NETTO_PRIJS_1_PER_VERPAKKING - decimal(14, 4) */ @Nettotprijs1perverpakking,
/* ALM_KORTING_2 - decimal(14, 4) */ @korting2,
/* ALM_NETTO_PRIJS_2_PER_EENHEID - decimal(14, 4) */ @Nettotprijs2perEenheid,
/* ALM_NETTOPRIJS_2_PER_VERPAKKING - decimal(14, 4) */ @Nettotprijs2perverpakking,
/* ALM_REMOVED - bit */ 1)
SET @ALM_ID = scope_identity()
--PRINT 'ArtikelleverancierMagazijn aangemaakt voor product ' + @artnr + ' nieuwid: ' + CONVERT(VARCHAR(12),@ALM_ID)
END -- end of the ARTIKELLEVERANCIERMAGAZIJN insert
END TRY
BEGIN CATCH
PRINT @teller
PRINT @vwaarde
PRINT ERROR_MESSAGE()
SET @teller = 1000000
END CATCH
PRINT CONVERT(VARCHAR(12), @teller) + ' - ' + @artnr
set @teller = @teller + 1
END
if object_id('tempdb..#T') is not null
drop table #T
PRINT 'einde'
PRINT GETDATE()
GO
USE [master]
GO
sp_configure 'Ad Hoc Distributed Queries', 0
GO
RECONFIGURE
GO
If you could help me put this together more nicely in a T-SQL way, I would have a nice starting ground for my other procedures as well.
And that would definitely be the goal.
Tnx in advance already.
Wkr,
Eddy
PS: I have tried, as suggested in the best practices, to do this on the data in the temp table, but that does not work:
SELECT 'SELECT '
+QUOTENAME(val,'''')+','
+QUOTENAME(ID,'''')
+ ' UNION ALL'
FROM [#T]
It returns NULL, and I don't know why, because when I do it on the table KORTINGMAGAZIJN it works fine. So I include a little test data with a text file like the one I read in.
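A likely reason, for what it's worth: QUOTENAME returns NULL when its input is longer than 128 characters, and val is declared as varchar(135), so the whole concatenation becomes NULL for long rows. A rough alternative for long values (a sketch only, not tested against the real file) is to quote by hand:
SELECT 'SELECT '
+ '''' + REPLACE(val, '''', '''''') + ''','
+ '''' + CONVERT(varchar(12), ID) + ''''
+ ' UNION ALL'
FROM [#T]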
PS: if someone could come up with an SSIS solution for this or something alike, that would be appreciated also, because it would show people how to do the different steps in SSIS and still combine it all at the end.
I have tried it, but as I said, I'm too much of a novice in it.
March 24, 2009 at 10:26 am
Most of the time when I see this, it is memory usage on the client side. Network and SQL are usually fast on the server side. Your app may show constant memory, but memory may not be reclaimed via .NET garbage collection. Test this by breaking the file into 4 parts and running the update 4 times; if the total time for the 4 parts is much less than for 1 file, it's a client-side problem.
To check SQL, run a trace and look at the SQL side.
For the fastest upload, perform the bulk inserts to a transaction table in your application DB. Then build a couple of procs to (a) insert NEW records using a query and (b) update existing records using a query. Write messages to a log table so that you can query the log table when you're done.
By using this method the solution will be ALL SQL, and you will eliminate possible issues with client-side memory and network slowdown. By keeping a log table and a transaction table you can debug past errors in your logic.
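A bare-bones sketch of the insert-new proc (the transaction table, log table and column names here are all made up for illustration; the update proc would follow the same pattern):
CREATE PROCEDURE dbo.usp_ImportArtikelen_InsertNew
AS
BEGIN
SET NOCOUNT ON
DECLARE @rows INT
-- (a) insert NEW records from the transaction (staging) table with a single query
INSERT INTO dbo.ARTIKELMAGAZIJN
(ARM_ARTIKEL_NR_DD, ARM_STOCKCONTROLE, ARM_VERPLICHT_MIN_IN_STOCK, ARM_VERPLICHT_MAX_STOCK)
SELECT t.ArtikelNr, 0, 0, 0
FROM dbo.IMPORT_TRANSACTIE AS t
WHERE NOT EXISTS (SELECT 1 FROM dbo.ARTIKELMAGAZIJN AS a
WHERE a.ARM_ARTIKEL_NR_DD = t.ArtikelNr)
SET @rows = @@ROWCOUNT
-- write a message to the log table so the run can be checked afterwards
INSERT INTO dbo.IMPORT_LOG (LogDatum, Boodschap)
VALUES (GETDATE(), 'Inserted ' + CONVERT(varchar(12), @rows) + ' new rows')
END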
Good luck
March 24, 2009 at 2:59 pm
What happens to the application performance if you re-enable (or create?) an index on the ARM_ARTIKEL_NR_DD field on [dbo].[ARTIKELMAGAZIJN]? You're probably causing a full table scan to look up @ARM_ID every time you execute sproc_tmpImportMercedesArtikelen.
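For example (just a sketch; use whatever index name fits your conventions):
-- supports the lookup WHERE ARM_ARTIKEL_NR_DD = @artikelNr in the import proc,
-- so it becomes an index seek instead of a full scan
CREATE NONCLUSTERED INDEX IX_ARTIKELMAGAZIJN_ARM_ARTIKEL_NR_DD
ON dbo.ARTIKELMAGAZIJN (ARM_ARTIKEL_NR_DD)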
Regards,
Martin
April 2, 2009 at 12:33 am
I have to admit that I have not read all the posts... and maybe I'm way off... but...
I had the same problem, only I was using a SQL stored procedure to do some massive calculations and inserts... maybe your problem is connected with file growth? Because you are inserting many rows, you may need to reconfigure the file growth settings... it helped me...
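For example (a sketch only; the logical file names and growth sizes are placeholders for your own settings):
-- grow in larger fixed increments instead of small or percentage-based autogrowth,
-- so the load is not interrupted by many small autogrow events
ALTER DATABASE [develop]
MODIFY FILE (NAME = N'develop_data', FILEGROWTH = 500MB)
ALTER DATABASE [develop]
MODIFY FILE (NAME = N'develop_log', FILEGROWTH = 200MB)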
April 2, 2009 at 11:40 am
Hi,
Re-creating the index did not make a real difference, although it is a bit faster.
But you were right about the table scans.
File growth is not the issue; the database is configured to grow in 500 MB blocks when needed, and the log file is configured to grow by 200 MB at a time when needed.
But neither of them has occurred (the database has a small gig of free space, proactive growth).
The log file is truncated every 15 minutes due to the transaction log backup and has never reached 30% of its size during the load.
I think I have pinned down the issue: my import program is disposing of its connection object after every execution of the stored proc.
I did not notice this right away; I will try next week whether I can solve it this way.
Not using Dispose, but just a Connection.Close, should be sufficient, I guess.
That way, if I have read my documentation correctly, I should be able to re-use the pooled connection?
Tnx for all the feedback on this issue.
Wkr,
Eddy
April 9, 2009 at 2:47 pm
Eddy, you should absolutely leave your connection object open, and read up on Dispose, Finalize and garbage collection. I would also read up on execution plans for stored procs and use Profiler to look for recompiles of the proc. It doesn't work like recompiles of .NET code.
I don't have a link to explain it, but here's my understanding of the process: if you have a big proc with lots of IF/THEN logic, you may inadvertently cause a recompile because your data is branching to a part of the logic that was not executed the last time the proc ran.
I think your problem is mostly client side, so I'd still try to split the data into 4 parts and, just for fun, see if the 4 parts run significantly faster than 1 big run. Also, some visual controls try to show the last item in the list, so you may be scrolling messages in a listbox or command window without knowing it. For ultimate client-side debugging try this. Watch the 1 hour screencast, it's truly amazing.
http://technet.microsoft.com/en-us/sysinternals/bb896645.aspx
Good luck