February 13, 2009 at 5:12 am
Please, I have a problem with a database which is running out of space: 300 MB free out of 40 GB of total space. I've tried to add a new datafile, but I get the following error:
Could not allocate space for object 'sys.sysfiles1'.'sysfiles1' in database 'HsWebLaboratorio' because the 'PRIMARY' filegroup is full. Create disk space by deleting unneeded files, dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup. (Microsoft SQL Server, Error: 1105)
I have manually grown by 100 MB the only datafile that still had room to grow (about 200 MB of autogrow space remaining), and I've also grown the master database. The database has only one filegroup (PRIMARY) and 8 datafiles:
2 of 7000 MB and the others of 4000 MB (all but one have reached their maximum autogrowth size).
I'd prefer not to increase the 4000 MB limit, so as to keep the files at a smaller size.
What can I do?
Thanks in advance,
sb
February 13, 2009 at 5:35 am
Are you sure this is the message you are getting when trying to add a datafile?
...Could not allocate space for object 'sys.sysfiles1'.'sysfiles1' in database 'HsWebLaboratorio' because the 'PRIMARY' filegroup is full. Create disk space by deleting unneeded files, dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup. (Microsoft SQL Server, Error: 1105)
...
That specific error appears to be related to the fact that you are running out of space, not to the act of adding a new datafile.
Could you please double-check?
_____________________________________
Pablo (Paul) Berzukov
Author of Understanding Database Administration available at Amazon and other bookstores.
Disclaimer: Advice is provided to the best of my knowledge, but no implicit or explicit warranties are provided. Since the advisor explicitly encourages testing any and all suggestions in a test, non-production environment, the advisor should not be held liable or responsible for any actions taken based on the given advice.
February 13, 2009 at 5:39 am
That's what I was thinking; the message is exactly the reason why I need to add the file!
To be sure, I've tried again; the message is:
TITLE: Microsoft SQL Server Management Studio
------------------------------
Alter failed for Database 'HsWebLaboratorio'. (Microsoft.SqlServer.Smo)
------------------------------
ADDITIONAL INFORMATION:
An exception occurred while executing a Transact-SQL statement or batch. (Microsoft.SqlServer.ConnectionInfo)
------------------------------
Could not allocate space for object 'sys.sysfiles1'.'sysfiles1' in database 'HsWebLaboratorio' because the 'PRIMARY' filegroup is full. Create disk space by deleting unneeded files, dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup. (Microsoft SQL Server, Error: 1105)
For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&ProdVer=09.00.3042&EvtSrc=MSSQLServer&EvtID=1105&LinkId=20476
------------------------------
BUTTONS:
OK
------------------------------
February 13, 2009 at 5:50 am
Looks like you have actually run out of space. You can't even insert a row into sys.sysfiles1 when trying to add a datafile.
I would let some datafile grow a little, then add the new file.
You can also try DBCC and check whether there is any allocation issue; maybe SQL Server "thinks" you have no space at all while some space is actually still there for you to use.
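Something along these lines - the logical names, path, and sizes below are just placeholders, adjust them to your own files:
-- grow one of the existing datafiles a little to free some headroom in PRIMARY
ALTER DATABASE HsWebLaboratorio
MODIFY FILE (NAME = 'HsWebLaboratorio_Data1', SIZE = 4100MB);
-- then add the new datafile to the same filegroup
ALTER DATABASE HsWebLaboratorio
ADD FILE (NAME = 'HsWebLaboratorio_Data9',
          FILENAME = 'E:\SQLData\HsWebLaboratorio_Data9.ndf',
          SIZE = 500MB, MAXSIZE = 4000MB, FILEGROWTH = 100MB)
TO FILEGROUP [PRIMARY];
-- and to rule out wrong allocation information
DBCC UPDATEUSAGE ('HsWebLaboratorio');
DBCC CHECKDB ('HsWebLaboratorio');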
_____________________________________
Pablo (Paul) Berzukov
Author of Understanding Database Administration available at Amazon and other bookstores.
Disclaimer: Advice is provided to the best of my knowledge, but no implicit or explicit warranties are provided. Since the advisor explicitly encourages testing any and all suggestions in a test, non-production environment, the advisor should not be held liable or responsible for any actions taken based on the given advice.
February 13, 2009 at 6:01 am
Thanks for the suggestion. I've grown an existing datafile from 4000 to 4500 MB, with autogrowth up to 5000 MB; this worked. But then, when I retried adding a datafile of 500 MB, I got the same error. Is there perhaps a limit on the size or on the number of datafiles a filegroup can have?
Running the query
SELECT @dbname,
DB_Total_Size_in_MB = sum(TotalExtents1)*65536.0/1048576.0,
DB_Used_Size_in_MB = sum(UsedExtents1)*65536.0/1048576.0,
DB_Free_Size_in_MB = (sum(TotalExtents1)-sum(UsedExtents1))*65536.0/1048576.0, --sum(TotalExtents1-UsedExtents1)*65536.0/1048576.0,
DB_Percent_Used = (sum(UsedExtents1)/sum(TotalExtents1))*100 --sum(UsedExtents1/TotalExtents1)*100
FROM #dbcc_showfilestats
there is free space; even initially there were about 150 MB free in total in the database, excluding the 200 MB still available for autogrow.
So what can be the matter?
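For reference, the temp table in that script is filled beforehand roughly like this - the column names ending in 1 come from the script I copied, and each extent is 64 KB, which is where the 65536 factor comes from:
CREATE TABLE #dbcc_showfilestats
    (Fileid int, FileGroup1 int, TotalExtents1 bigint, UsedExtents1 bigint,
     Name1 sysname, FileName1 nchar(520));
INSERT INTO #dbcc_showfilestats
    EXEC ('USE HsWebLaboratorio; DBCC SHOWFILESTATS;');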
February 13, 2009 at 6:05 am
Are you positive you have just one filegroup - PRIMARY?
_____________________________________
Pablo (Paul) Berzukov
Author of Understanding Database Administration available at Amazon and other bookstores.
Disclaimer: Advice is provided to the best of my knowledge, but no implicit or explicit warranties are provided. Since the advisor explicitly encourages testing any and all suggestions in a test, non-production environment, the advisor should not be held liable or responsible for any actions taken based on the given advice.
February 13, 2009 at 6:10 am
Yes, I am. I think all of our databases have only one filegroup (PRIMARY)...
February 13, 2009 at 6:49 am
Silly question.
Have you checked your hard drive available space yet?
The biggest problem I see regarding this, especially when the error concerns the PRIMARY filegroup, is that either the logical drive containing the data files doesn't have enough free space or the drive containing the SQL Server program files on the server is getting full.
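If you want to double-check from inside SQL Server, xp_fixeddrives (undocumented, but widely used) lists the free megabytes per local drive letter:
EXEC master.dbo.xp_fixeddrives;
-- one row per drive letter with the free space in MB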
February 16, 2009 at 3:45 am
Thanks, it's the first thing I checked. Disk space is not a problem: I have about 100 GB free on the datafile drive and about 11 GB on the SQL Server drive.
The question about having only a primary filegroup made me think. I'm not yet very expert at managing databases; I know it is recommended to split logically different data (according to the programmers) into several filegroups, but are there problems with keeping only one?
Back to my problem: if I grow existing datafiles, I get no errors, but I can't manage to add a new one. I can cover the immediate need this way; the point is to keep the datafiles under 5 GB if possible.
once again thanks,
sb
February 16, 2009 at 5:37 am
Question, what drive(s) are your system DBs on?
Also, does your Model db have a max file size?
Is your TempDB maxed out when you try this?
EDIT: I should clarify why I'm asking these questions. I believe your system DBs are on a different drive and that drive happens to be getting full. The Model DB is used when creating / altering DBs, so if that database is on a drive that is "full" or nearly full, then it might be causing your issue. Especially if the file you're adding to your user DB is bigger than the space available to the Model DB.
It's a long shot, but it's worth checking.
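A quick way to check the model and tempdb file limits in one go - size and max_size are in 8 KB pages, so dividing by 128 gives MB, and max_size = -1 means unlimited:
SELECT DB_NAME(database_id) AS database_name,
       name AS logical_name,
       size / 128 AS size_mb,
       CASE max_size WHEN -1 THEN NULL ELSE max_size / 128 END AS max_size_mb,
       physical_name
FROM sys.master_files
WHERE database_id IN (DB_ID('model'), DB_ID('tempdb'));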
February 16, 2009 at 5:56 am
Question, what drive(s) are your system DBs on?
System DBs are on the same drive as the non-system DBs; it's a SAN disk.
Transaction logs are on a different drive.
Also, does your Model db have a max file size?
All of the system DBs have unlimited file size.
Is your TempDB maxed out when you try this?
I do not think so; its size at the moment is 8 MB for data and 1 MB for the transaction log, so it doesn't seem to have tried to grow that much...
Talking about tempdb, what about a supplier asking me to run the following code at startup in tempdb
GRANT REFERENCES ON SCHEMA :: dbo TO guest
so that it can read their user-defined datatypes?
I'm not willing to allow this grant; that way any user could read everyone's schema objects by going through tempdb, couldn't they?
In a few minutes we will reboot the SQL Server; I hope the problem solves itself...
Any further suggestions will be very much appreciated...
sb
😉
February 16, 2009 at 6:02 am
Have you checked the drive the transaction logs are on to make sure it isn't full?
Are you backing up your transaction logs (and checkpointing / truncating them) on a regular basis?
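A routine log backup for the user database is just something along these lines (the path below is only a placeholder):
BACKUP LOG HsWebLaboratorio
TO DISK = 'E:\Backups\HsWebLaboratorio_log.trn';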
As far as granting references on TempDB goes, ask your supplier why he would want references on TempDB. There's no pass-through as far as I know, but there's also no reason for anyone to need access to TempDB. It sounds like a silly request, and I would want more detail on why this would be important to him before I'd even consider it.
February 16, 2009 at 6:04 am
GRANT REFERENCES ON SCHEMA :: dbo TO guest
The guest user account is a dangerous account. I would not grant any permissions to it. Why don't you create a login for this and assign permissions to it individually?
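For example, something roughly like this instead of touching guest - the login name and password are made up, and since TempDB is recreated at every restart the user would have to be recreated (or set up from a startup procedure) each time:
CREATE LOGIN SupplierApp WITH PASSWORD = 'UseAStrongPasswordHere!1';
USE tempdb;
CREATE USER SupplierApp FOR LOGIN SupplierApp;
GRANT REFERENCES ON SCHEMA::dbo TO SupplierApp;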
February 16, 2009 at 6:09 am
BTW, as TempDB is essentially recreated every time you restart the services, what user-defined datatypes are going to exist in it?
Does your supplier actually have code modifying TempDB?
That sets off alarm bells in my head. It sounds like the supplier doesn't have a clue. I could see user-defined datatypes in Model, but not in TempDB.
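If the supplier genuinely needs a user-defined type to exist in TempDB, the usual place to create it is Model, since TempDB is rebuilt from Model at every restart - a hypothetical example:
USE model;
CREATE TYPE dbo.SupplierCode FROM varchar(20) NOT NULL;
-- after the next restart, tempdb will contain dbo.SupplierCode as well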
February 16, 2009 at 6:16 am
The drive the transaction logs are on has enough free space.
I actually back up only the non-system DBs' transaction logs on a regular basis; the system DBs use the simple recovery model (except msdb, I've noticed just now, which has the full recovery model without its transaction log being backed up - I don't know why, it's a previous configuration made by someone else; maybe I'd better change it).
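If I do change it, I suppose it's just a matter of either
ALTER DATABASE msdb SET RECOVERY SIMPLE;
or scheduling a log backup for msdb like the ones for the user databases.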
As regards tempdb, after a lot of questions, the only explanation the supplier has given me is that they need to give the guest user the right to read their user-defined data types. I think user-defined datatypes are not visible in temp tables; this is probably the issue. The problem is that we already have many other DBs from other suppliers on the same instance. Could it create security problems?