August 10, 2012 at 12:03 pm
I am too cheap to get some actual testing software. What I want to do is write something that will force SQL to use all available memory.
I created a test database with a table that has more than a million rows. The table has the following fields: date/time, int, float, varchar(50). All fields have different, random data in them.
To hammer the disks, I created a VBS script that, when executed, will spawn a number of VBS scripts that insert, delete, and select data from this table. The number of scripts it spawns and the length of time it runs are parameters. While this pushes the disk and processor when I set it to 500 instances, it doesn't really tax the memory. As a side note - the processor spikes are more from the number of wscript.exe processes than anything else.
I can execute a query through this script, so now I am trying to find a query that is memory intensive so I can run it a couple hundred times at the same time.
So here is everyone's chance to provide examples of queries that kill the memory.
The underlying reason for doing this is to test whether the VM will actually release the memory when the balloon driver claims it. With Windows 2008 R2 on a VM, the hypervisor will take any memory that is not being utilized and use it for other VM instances. This makes it look like memory is at 97% on my SQL instance. The sysadmin doesn't want to reserve memory for the VM instances. I just don't like my memory sitting up there at 97%. How can I tell if I'm having memory issues when it always looks like I'm having memory issues?
So I want to see if the VM will actually give me my memory back when I need it. Hence the request for everyone to start writing some very bad queries that will kill the memory.
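In the meantime, rather than trusting the OS memory percentage, one thing I can watch is SQL Server's own buffer manager counters - for example Page Life Expectancy, where a low or steadily falling value would point at genuine memory pressure:
SELECT [object_name], counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name = N'Page life expectancy'
AND [object_name] LIKE N'%Buffer Manager%';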
Thank you.
...
August 10, 2012 at 1:26 pm
Easiest way, get a DB that's > available memory and run a loop querying every table. You want to fill the data cache, not hammer the CPU, so you just need lots of data.
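Something along these lines would do it - an untested sketch that builds a COUNT against every user table in the current database, which forces a scan of each heap and drags its pages into the buffer pool (on tables that do have indexes the optimiser may scan the narrowest index instead):
DECLARE @sql nvarchar(max) = N'';

SELECT @sql = @sql + N'SELECT COUNT_BIG(*) FROM '
+ QUOTENAME(s.name) + N'.' + QUOTENAME(t.name) + N'; '
FROM sys.tables AS t
INNER JOIN sys.schemas AS s ON s.schema_id = t.schema_id;

EXEC (@sql); -- wrap this in a WHILE loop if once through the data isn't enough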
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
August 10, 2012 at 1:32 pm
Here is the table:
CREATE TABLE [dbo].[xxxLoadSimTable](
[datetimedata] [datetime] NULL,
[intdata] [int] NULL,
[floatdata] [float] NULL,
[varchardata] [varchar](50) NULL
)
Here is a sample line of data:
datetimedata             intdata  floatdata         varchardata
2012-10-08 11:26:36.903  216      216.201080970516  cuH26-bHR'=WbPhH9E`t1T=KTNVQa8>W#`jpKJ+Y;8kw^+|KZj
...
August 10, 2012 at 1:32 pm
Or, you could just use the default max memory setting that comes with a fresh install of SQL Server and let the server grab all the memory the VM has to offer (and possibly throw in all those queries too).
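For reference, that default is 'max server memory (MB)' = 2147483647, which is effectively unlimited; something like this will show (or change) it:
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)'; -- 2147483647 is the install default
-- EXEC sp_configure 'max server memory (MB)', 2147483647; RECONFIGURE;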
August 10, 2012 at 2:41 pm
True. It's the amount of data that we're putting into memory that matters, not the number of times we're querying it.
With that in mind, I've created a large table:
CREATE TABLE [dbo].[TestTable](
[datetimedata] [datetime] NULL,
[intdata] [int] NULL,
[floatdata] [float] NULL,
[chardata] [varchar](8000) NULL,
[textdata] [text] NULL
)
The RBAR queries will cause the processor to go nuts, but they don't do anything for memory.
Still testing...
...
August 10, 2012 at 3:15 pm
The varchar and the text do nothing unless populated.
CREATE TABLE [dbo].[TestTable](
[datetimedata] [datetimeoffset] not null default getdate(),
[intdata] [bigint] not null default 0,
[floatdata] numeric(38,0) not null default 0,
[chardata] [char](7500) not null default ' ',
[textdata] nvarchar(max) not null default replicate(cast(' ' as nvarchar(max)), 50000)
)
Untested.
I make that about 108 KB per row.
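To actually populate it and chew through memory, something as simple as this should work (run from SSMS or sqlcmd; the GO count is just an example - at roughly 108 KB a row, 100000 rows is in the region of 10 GB):
INSERT INTO [dbo].[TestTable] DEFAULT VALUES;
GO 100000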
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability