November 16, 2022 at 5:33 pm
Hi there,
We are working on an Azure SQL Database through the free trial subscription.
The spec is a single database with 10 DTUs and 250 GB of storage.
We have noticed performance bottlenecks on this database when running a script.
I queried the top 100 records of a table in Azure and it took 5 minutes to return results;
the same table queried on my local instance returned in milliseconds.
Is there a performance bottleneck on the Azure free trial?
I ran the following script as a test on both my local instance and the Azure free trial:
=========================================================================================
DECLARE @random FLOAT
DECLARE @Upper FLOAT
DECLARE @Lower FLOAT
DECLARE @FromDate DATETIME2(2)
DECLARE @ToDate DATETIME2(2)
DECLARE @QueryStartTime DATETIME2(2)
DECLARE @QueryFinishTime DATETIME2(2)
SET @QueryStartTime = GetDate()
/*************************************************************************************************
CREATE TEMP TABLE
*************************************************************************************************/
SET NOCOUNT ON
DROP TABLE IF EXISTS #ChannelReading
CREATE TABLE #ChannelReading (
[ReadingDateTime] [datetime2](7) NULL,
[SIReading] [real] NULL,
[RawReading] [int] NULL
) ON [PRIMARY]
/*************************************************************************************************
DECLARE AND SET LOCAL VARIABLES
*************************************************************************************************/
-- Extract out the last date entry from table
----Set range for Date Values to be inserted
SET @FromDate = '2022-06-13 10:11:53.00'
SET @ToDate = '2022-06-14 20:11:53.00'
SET @Lower = 21.90172 ---- The lowest random number
SET @Upper = 37.35857 ---- The highest random number
/****************************************************************************************************************************/
/* EXTRACT DATE, LOWER AND UPPER FROM EXISTING TABLE */
/****************************************************************************************************************************/
SELECT @FromDate as FromDate,@ToDate as ToDate, @Lower as [Lower], @Upper as [Upper]
/*************************************************************************************************
NOW LOOP THROUGH THE INTERVALS BETWEEN THE FROMDATE AND TODATE
AND INSERT RECORDS (IN 1 SECOND INTERVALS) WITH A RANDOMIZED TEMPERATURE
READING.
*************************************************************************************************/
WHILE @FromDate <= @ToDate
BEGIN
-- Increment date value by 1 second
SET @FromDate = DateAdd(SECOND,1,@FromDate)
-- Now create a random temperature value between the bounds specified
SELECT @random = ((@Upper - @Lower) * RAND() + @Lower)
--Now insert the record
INSERT INTO #ChannelReading([ReadingDateTime],[SIReading])
VALUES (@FromDate, @random)
END
--SET NOCOUNT OFF
select count(1) as BaseReadingCount from #ChannelReading
select * from #ChannelReading
SET @QueryFinishTime = GetDate()
select datediff(s, @QueryStartTime,@QueryFinishTime) as [QueryExecutionTime(seconds)]
===================================================================================
The above script executed as follows:
Local instance - 8 seconds
Our database server - 13 seconds
Azure - 36 seconds
That is a massive jump from a local instance running on my laptop (Core i5, 8 GB RAM) to
a database on the Azure free trial.
November 16, 2022 at 7:28 pm
An Azure database with 10 DTUs is the second-lowest size available; up the size and try again. Where are you running the query from, the Azure portal or locally on your system with SSMS? If local, you may also have network latency involved.

Azure databases have many different sizes (and costs) available. Typically a local SQL Server is sized a lot larger because it hosts multiple databases which share the performance, while in Azure the performance is set at the database level. A nice thing is that the database compute level can be easily adjusted up and down in Azure - I have a couple of things I do where I bump the size up for an hour or so to accomplish something quicker, then adjust it back down when not much is going on.
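Adjusting the compute level can be done from the portal or directly in T-SQL. A minimal sketch of the scale-up/scale-down pattern described above, assuming a hypothetical database name `MyDb` and Standard-tier service objectives (the scale operation is asynchronous, so it completes in the background):

```sql
-- Scale up before a heavy workload ('MyDb', 'S3', and 'S0' are placeholders)
ALTER DATABASE [MyDb] MODIFY (EDITION = 'Standard', SERVICE_OBJECTIVE = 'S3');

-- ...run the heavy workload here...

-- Scale back down when the quiet period starts
ALTER DATABASE [MyDb] MODIFY (EDITION = 'Standard', SERVICE_OBJECTIVE = 'S0');
```

The current service objective can be checked with `SELECT DATABASEPROPERTYEX('MyDb', 'ServiceObjective')`.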
November 16, 2022 at 8:36 pm
An Azure database with 10 DTUs is the second-lowest size available; up the size and try again.
He's using the "free trial" version... can you actually up the DTUs there? I'm thinking, probably not.
--Jeff Moden
Change is inevitable... Change for the better is not.
November 16, 2022 at 8:38 pm
Now this is a massive jump from a local instance running on my laptop (Core i5, 8 GB RAM) to a database on the Azure free trial
Welcome to the world of running on "OPS", the clean version of which is "Other People's Systems".
Also, if the looping code you posted is going to be used for anything real, there's a whole lot faster method.
--Jeff Moden
Change is inevitable... Change for the better is not.
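For illustration, one common set-based alternative to the posted WHILE loop is a tally (numbers) CTE driving a single INSERT...SELECT. This is a sketch, not necessarily the method Jeff has in mind; it assumes the same `#ChannelReading` table and `@FromDate`/`@ToDate`/`@Lower`/`@Upper` variables from the original script:

```sql
-- Number of one-second intervals to generate
DECLARE @Seconds INT = DATEDIFF(SECOND, @FromDate, @ToDate);

WITH Tally AS (
    -- Generate @Seconds sequential row numbers; cross-joining a system view
    -- to itself supplies more than enough rows
    SELECT TOP (@Seconds)
           ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
    FROM sys.all_columns AS ac1
    CROSS JOIN sys.all_columns AS ac2
)
INSERT INTO #ChannelReading (ReadingDateTime, SIReading)
SELECT DATEADD(SECOND, n, @FromDate),
       -- RAND(CHECKSUM(NEWID())) yields a different value per row;
       -- plain RAND() would return the same value for every row in the set
       (@Upper - @Lower) * RAND(CHECKSUM(NEWID())) + @Lower
FROM Tally;
```

A single set-based insert like this avoids the per-row loop overhead, which matters even more on a throttled 10 DTU database.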
November 17, 2022 at 5:18 pm
Hi guys,
Thanks for your responses on this.
We were running the query in SSMS connected to Azure.
We have now done some testing with our deployment. We deploy from Rider using SqlPackage.
The database project we were deploying was targeting <SqlServerVersion>Sql150</SqlServerVersion>;
this needed to be changed to <SqlServerVersion>SqlAzure</SqlServerVersion>.
We also needed to change the compatibility level on the Azure database to 160 (SQL Server 2022).
We then noticed an improvement in performance in our application.
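For reference, the compatibility-level change described above can be checked and applied in T-SQL; `MyDb` here is a placeholder for the actual database name:

```sql
-- Check the current compatibility level
SELECT name, compatibility_level
FROM sys.databases
WHERE name = 'MyDb';

-- Raise it to the SQL Server 2022 level
ALTER DATABASE [MyDb] SET COMPATIBILITY_LEVEL = 160;
```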
What's also interesting to note: we do a SELECT TOP 100 from a local audit table
that has an NVARCHAR(MAX) column.
Include that column and the SELECT TOP 100 takes 3 minutes to run;
remove the NVARCHAR(MAX) column and the query returns results instantly!
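When the full large-value column isn't needed, one workaround is to return only a prefix of it. A sketch, assuming a hypothetical `dbo.Audit` table with an NVARCHAR(MAX) column named `Detail`:

```sql
SELECT TOP (100)
       AuditId,
       AuditDate,
       -- Return a short preview instead of streaming the full MAX value
       LEFT([Detail], 200) AS DetailPreview
FROM dbo.Audit
ORDER BY AuditDate DESC;
```

NVARCHAR(MAX) values over 8,000 bytes are stored off-row, so selecting the full column can force extra lookups and transfer far more data over the network than the rest of the row, which is consistent with the slowdown described above.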