MDF file too large: PRIMARY FILEGROUP FULL error

  • I run SQL Server 2008 R2 Express on my local machine with Windows 7 Home Premium. I use the db to run a DotNetNuke 7 website. Everything was running fine until one day I got a "Cannot allocate space to object 'x'... PRIMARY FILEGROUP IS FULL" error message. I have cleared the logs from the application, but the error persisted. I checked the database and found the mdf file has grown to over 10GB in size, while the ldf file is only around 175MB. The details of the database are as follows:

    SQL Server 2008 Express R2

    Recovery Mode: Simple

    mdf file size: 10.23 GB

    Initial size: 10.23 GB (it shows the same)

    Space Remaining: 0.0 GB

    Data File autogrowth settings: by 10MB, Unrestricted Growth

    ldf file size: 173 MB

    Log File autogrowth settings: by 10% Restricted Growth

    Also, when I try to shrink the file from Management Studio (Tasks > Shrink), it says it cannot find the file. However, the file is definitely there at the path mentioned.

    Error Details

    File

    Error Index #: 0

    Source: .Net SqlClient Data Provider

    Class: 17

    Number: 1105

    Procedure: AddEventLog

    Message: System.Data.SqlClient.SqlException (0x80131904): Could not allocate space for object 'dbo.EventLog'.'PK_EventLogMaster' in database 'hescom_db' because the 'PRIMARY' filegroup is full. Create disk space by deleting unneeded files, dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.
    at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)
    at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)
    at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning()
    at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
    at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString)
    at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async)
    at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, DbAsyncResult result)
    at System.Data.SqlClient.SqlCommand.InternalExecuteNonQuery(DbAsyncResult result, String methodName, Boolean sendToPipe)
    at System.Data.SqlClient.SqlCommand.ExecuteNonQuery()
    at PetaPoco.Database.Execute(String sql, Object[] args)
    at DotNetNuke.Data.PetaPoco.PetaPocoHelper.ExecuteNonQuery(String connectionString, CommandType type, String sql, Object[] args)
    at DotNetNuke.Data.SqlDataProvider.ExecuteNonQuery(String procedureName, Object[] commandParameters)
    at DotNetNuke.Data.DataProvider.AddLog(String logGUID, String logTypeKey, Int32 logUserID, String logUserName, Int32 logPortalID, String logPortalName, DateTime logCreateDate, String logServerName, String logProperties, Int32 logConfigID)
    at DotNetNuke.Services.Log.EventLog.DBLoggingProvider.WriteLog(LogQueueItem logQueueItem)

    I am new to DB management and any help is appreciated.

    Thank You.
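    (For reference, a first diagnostic step in this situation is to check the size and internal free space of each database file. A sketch, run in SSMS against the database named in the error message:)

    ```sql
    USE hescom_db;
    GO

    -- size is stored in 8KB pages, so divide by 128 to get MB
    SELECT
        name,
        physical_name,
        size / 128.0                                     AS size_mb,
        FILEPROPERTY(name, 'SpaceUsed') / 128.0          AS used_mb,
        (size - FILEPROPERTY(name, 'SpaceUsed')) / 128.0 AS free_mb
    FROM sys.database_files;
    ```

    If `free_mb` on the data file is near zero and the file can no longer grow, the filegroup-full error follows.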

  • The answer is "simple". You have too much data for Express to handle and will need to delete some or move to a different edition of SQL Server.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • SQL Server 2008 R2 Express edition limits each database to 10GB of data.

    Solution?

    - split the database

    - drop/move some data

    ref: Features Supported by the Editions of SQL Server 2008 R2 http://technet.microsoft.com/en-us/library/cc645993%28v=sql.105%29.aspx#Scalability
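    Since the 10GB cap applies per database (to the data files, not the log), "splitting" means moving data out into a second database. A hypothetical sketch (`ArchiveDb` and the choice of `dbo.EventLog` as the table to move are assumptions for illustration):

    ```sql
    -- Create a second database to absorb archivable data
    CREATE DATABASE ArchiveDb;
    GO

    -- Copy the rows out, then empty the table in the original database
    SELECT *
    INTO ArchiveDb.dbo.EventLog_Archive
    FROM hescom_db.dbo.EventLog;

    TRUNCATE TABLE hescom_db.dbo.EventLog;
    ```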

    Johan

    Learn to play, play to learn !

    Don't drive faster than your guardian angel can fly ...
    but keeping both feet on the ground won't get you anywhere :w00t:

    - How to post Performance Problems
    - How to post data/code to get the best help

    - How to prevent a sore throat after hours of presenting ppt

    press F1 for solution, press shift+F1 for urgent solution 😀

    Need a bit of Powershell? How about this

    Who am I ? Sometimes this is me but most of the time this is me

  • Fixed it, folks. I had understood that the data had reached its limit, but I was worried because the application was far too small to fill up 10 GB, so I suspected something was wrong with the db. I identified the table occupying all that data and found that the website had created over a million search index records. I truncated that table and similar index tables in related tables, rebuilt the index from the application, and now have over 9GB free.

    Thanks Jeff Moden and ALZDBA for your quick responses.
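    (For anyone hitting the same wall: the offending table can be found by listing tables by reserved space. A sketch using catalog views available in 2008 R2:)

    ```sql
    -- Largest tables by reserved space, descending
    SELECT TOP (10)
        OBJECT_SCHEMA_NAME(ps.object_id) + '.' + OBJECT_NAME(ps.object_id) AS table_name,
        SUM(CASE WHEN ps.index_id IN (0, 1) THEN ps.row_count ELSE 0 END)  AS row_count,
        SUM(ps.reserved_page_count) * 8 / 1024.0                           AS reserved_mb
    FROM sys.dm_db_partition_stats ps
    GROUP BY ps.object_id
    ORDER BY reserved_mb DESC;
    ```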

  • In that case, I would shrink the mdf file to an appropriate size and monitor its growth, to prevent your application from going down again because of the edition's size restriction.

    If you shrink the mdf file, keep in mind that you need to rebuild all your indexes after the shrink operation, since shrinking fragments them heavily.
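    A sketch of that sequence (the logical file name and target size are assumptions; check `sys.database_files` for the actual logical name):

    ```sql
    USE hescom_db;
    GO

    -- Shrink the data file to ~2GB (logical name 'hescom_db' is an assumption)
    DBCC SHRINKFILE (hescom_db, 2048);
    GO

    -- Shrinking fragments indexes, so rebuild them afterwards
    -- (repeat per table; dbo.EventLog shown as an example)
    ALTER INDEX ALL ON dbo.EventLog REBUILD;
    ```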

    Johan


  • Sure. Thanks.
