December 14, 2009 at 4:15 am
Hi Experts,
While running DBCC CHECKDB on a server, I'm getting the following error:
"A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - The specified network name is no longer available.)".
My MS SQL Server 2005 instance has just one user database, which is 300 GB.
Please help.
Tanx 😀
December 14, 2009 at 7:16 am
That error looks like a lost connection to the server. Have you re-run the DBCC command?
Blog -- LearnSQLWithBru
Join on Facebook Page Facebook.com/LearnSQLWithBru
Twitter -- BruMedishetty
December 14, 2009 at 9:14 am
Looks like the server went off the network or was shut down. Check that the server is running, check that SQL is running, and try running CheckDB again. This is not a CheckDB error; it's a generic 'server gone away' error.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
December 14, 2009 at 10:21 pm
The server didn't go offline and wasn't restarted; I checked the logs.
When I run DBCC CHECKDB manually on the 300 GB database, the process stays in the executing state without giving any messages.
I ran it for around 30 minutes.
Is there any other way to break down the DBCC CHECKDB command, say to table level or page level?
On average, how many hours does DBCC CHECKDB take to check a 300 GB database?
Tanx 😀
December 15, 2009 at 2:04 am
Eswin (12/14/2009)
The server didn't go offline and wasn't restarted; I checked the logs.
Then there was some kind of network glitch between the client and the server. The error is a connection-related error: it's saying that the client could not identify or locate the server.
Is there any other way to break down the DBCC CHECKDB command, say to table level or page level?
You can run CheckFileGroup or CheckTable. If you do the latter, you'll need to run CheckAlloc and CheckCatalog as well.
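A rough sketch of checking one filegroup at a time (the database and filegroup names below are placeholders; adjust the WITH options as needed):
USE YourDatabase;  -- placeholder database name
GO
-- Check allocation and structural integrity of one filegroup at a time
DBCC CHECKFILEGROUP (N'PRIMARY') WITH NO_INFOMSGS;
GO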
On average, how many hours does DBCC CHECKDB take to check a 300 GB database?
Impossible to say. It depends on the I/O subsystem, other activity, etc. Half an hour doesn't sound that long, though.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
December 15, 2009 at 7:52 am
See CHECKDB From Every Angle: How long will CHECKDB take to run? and CHECKDB From Every Angle: Consistency Checking Options for a VLDB.
Seems like you're missing an 'h' from Thanx too 🙂
Paul Randal
CEO, SQLskills.com: Check out SQLskills online training!
Blog: www.SQLskills.com/blogs/paul Twitter: @PaulRandal
SQL MVP, Microsoft RD, Contributing Editor of TechNet Magazine
Author of DBCC CHECKDB/repair (and other Storage Engine) code of SQL Server 2005
December 19, 2009 at 11:16 pm
Hi
I used DBCC CHECKDB WITH ESTIMATEONLY
and found that around 2 GB of tempdb space is needed to run DBCC CHECKDB.
Then I performed DBCC CHECKDB WITH PHYSICAL_ONLY on that 300 GB DB. It took around 30 minutes to complete, without any errors, on the staging server.
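For reference, the commands were roughly of this form (the database name here is changed):
DBCC CHECKDB (N'MyDatabase') WITH ESTIMATEONLY;                 -- reports estimated tempdb space needed
DBCC CHECKDB (N'MyDatabase') WITH PHYSICAL_ONLY, NO_INFOMSGS;   -- physical/page-level checks only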
Can you advise me on how to proceed after this?
Can you also please tell me how to find the largest tables (by number of pages)?
My database has around 9,000 user tables.
Tanx 😀
December 20, 2009 at 1:37 am
Eswin (12/19/2009)
Hi
I used DBCC CHECKDB WITH ESTIMATEONLY
and found that around 2 GB of tempdb space is needed to run DBCC CHECKDB.
Then I performed DBCC CHECKDB WITH PHYSICAL_ONLY on that 300 GB DB. It took around 30 minutes to complete, without any errors, on the staging server.
Can you advise me on how to proceed after this?
Run CheckDB on the production database to completion. 30 minutes is not a long time for CheckDB (I've seen it run for 5 hours on a 1 TB database). Make sure that you've got a fair bit of maintenance time, so that if it takes longer than it did on staging it won't be a problem.
Or, if you want to do it piecemeal, run CheckAlloc, CheckCatalog and CheckTable on all tables.
Either way, let the checks run to completion. There's no point in running CheckDB and stopping it halfway through; it produces no information.
It could be that a longer run time is because there's corruption, or it may be because of higher activity, more I/O contention or a number of other factors.
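A rough sketch of the piecemeal approach (the database name is a placeholder; sp_MSforeachtable is an undocumented system procedure, and is also used in the script later in this thread):
USE YourDatabase;  -- placeholder database name
GO
-- Allocation and system-catalog checks for the whole database
DBCC CHECKALLOC WITH NO_INFOMSGS;
DBCC CHECKCATALOG;
GO
-- Then CHECKTABLE against every user table, one at a time
EXEC sp_MSforeachtable 'DBCC CHECKTABLE (''?'') WITH NO_INFOMSGS';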
Can you also please tell me how to find the largest tables (by number of pages)?
My database has around 9,000 user tables.
Query sys.dm_db_index_physical_stats.
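Something along these lines should do it (a rough sketch; the TOP value and the LIMITED scan mode are just suggestions):
SELECT TOP (20)
    OBJECT_NAME(ps.object_id) AS table_name,
    SUM(ps.page_count) AS total_pages
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ps
GROUP BY ps.object_id
ORDER BY total_pages DESC;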
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
December 21, 2009 at 3:26 am
Hi Eswin,
Here is a query to collect details of all tables, their data/index space usage, and the percentage of space each occupies in the database. Check whether this can be of any use.
-- Create the temp table for further querying
CREATE TABLE #temp(
rec_id int IDENTITY (1, 1),
table_name varchar(128),
nbr_of_rows int,
data_space decimal(15,2),
index_space decimal(15,2),
total_size decimal(15,2),
percent_of_db decimal(15,12),
db_size decimal(15,2))
-- Get all tables, names, and sizes
EXEC sp_msforeachtable @command1="insert into #temp(nbr_of_rows, data_space, index_space) exec sp_mstablespace '?'",
@command2="update #temp set table_name = '?' where rec_id = (select max(rec_id) from #temp)"
-- Set the total_size and total database size fields
UPDATE #temp
SET total_size = (data_space + index_space), db_size = (SELECT SUM(data_space + index_space) FROM #temp)
-- Set the percent of the total database size
UPDATE #temp
SET percent_of_db = (total_size/db_size) * 100
-- Get the data
SELECT *
FROM #temp
ORDER BY total_size DESC
-- Comment out the following line if you want to do further querying
DROP TABLE #temp