Viewing 15 posts - 76 through 90 (of 173 total)
ps (6/16/2009)
murthykalyani (6/16/2009)
June 16, 2009 at 7:26 pm
Find out which log backup file is failing and restore that log manually with the MOVE and NORECOVERY options. Once that is restored, please restart the restore job; this should fix your...
June 16, 2009 at 12:38 pm
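A minimal sketch of the manual restore described in the post above; the database name, backup path, and file names are placeholders to adjust for your environment:

-- Restore the failing log backup by hand, keeping the database in a
-- restoring state so the log shipping restore job can continue afterwards.
restore log MyDatabase
from disk = N'\\backupshare\logship\MyDatabase_20090616.trn'
with move 'MyDatabase_Data' to N'D:\Data\MyDatabase.mdf',
     move 'MyDatabase_Log'  to N'E:\Logs\MyDatabase_log.ldf',
     norecovery;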
Please post the entire history of the log shipping restore job's last run for the database you are having the issue with.
June 10, 2009 at 4:19 pm
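For reference, one way to pull that history is to query the SQL Agent job history tables in msdb; a minimal sketch, assuming a hypothetical job name:

-- Step-level history of the log shipping restore job; the failing log
-- backup file normally shows up in the step message.
select h.run_date, h.run_time, h.step_id, h.step_name, h.run_status, h.message
from msdb.dbo.sysjobhistory as h
join msdb.dbo.sysjobs as j
     on j.job_id = h.job_id
where j.name = 'LSRestore_PRIMARYSRV_MyDatabase'   -- placeholder job name
order by h.run_date desc, h.run_time desc, h.step_id;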
[Quote]
I had set up the following counters and I am posting some sample data, but I am unable to figure out whether the readings were <20ms.
How can I make sure it is...
June 4, 2009 at 12:29 pm
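As a cross-check on the PerfMon counters discussed in the post above, read latency can also be computed from sys.dm_io_virtual_file_stats. This is a different technique from the counters themselves, sketched here only for comparison:

-- Average read latency per database file, in milliseconds; values
-- consistently under ~20 ms are generally considered acceptable.
select db_name(vfs.database_id) as database_name,
       vfs.file_id,
       vfs.num_of_reads,
       case when vfs.num_of_reads = 0 then 0
            else vfs.io_stall_read_ms / vfs.num_of_reads
       end as avg_read_latency_ms
from sys.dm_io_virtual_file_stats(null, null) as vfs
order by avg_read_latency_ms desc;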
Krasavita (6/4/2009)
I have a datetime field with the value 1/1/2009 12:00:00 AM; how can I convert it to 1/1/2009? Thank you.
If you want it in the US standard, i.e. mm/dd/yyyy, then
select convert(varchar(10), cast('01/01/2009 12:00:00 AM' as datetime), 101)
if...
June 4, 2009 at 10:30 am
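A slightly fuller sketch of the same CONVERT with style 101, assuming a hypothetical table and datetime column:

-- dbo.Orders and OrderDate are placeholder names; style 101 formats the
-- datetime as mm/dd/yyyy and varchar(10) drops the time portion.
select convert(varchar(10), OrderDate, 101) as OrderDateUS
from dbo.Orders;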
rambilla4 (6/3/2009)
If the SAN is being shared, then you have to figure out what is causing these messages to be recorded in the error...
June 4, 2009 at 10:25 am
rambilla4 (6/3/2009)
I have noticed the following message in the error log:
2009-05-27 20:22:24.95 spid2s SQL Server has encountered 1 occurrence(s) of I/O requests taking longer than 15 seconds to complete on...
June 3, 2009 at 4:46 pm
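When that message appears, one way to see which files currently have stalled I/O (an assumption about how to investigate, not something stated in the post) is to join sys.dm_io_pending_io_requests to sys.dm_io_virtual_file_stats; a minimal sketch:

-- Outstanding I/O requests mapped back to database files;
-- large io_pending_ms_ticks values point at the slow volume.
select db_name(vfs.database_id) as database_name,
       vfs.file_id,
       pir.io_type,
       pir.io_pending_ms_ticks
from sys.dm_io_pending_io_requests as pir
join sys.dm_io_virtual_file_stats(null, null) as vfs
     on pir.io_handle = vfs.file_handle
order by pir.io_pending_ms_ticks desc;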
The link below explains how to migrate logins and passwords.
June 3, 2009 at 4:36 pm
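The link itself is not preserved above. One common approach (an assumption here, since the linked article is not visible) is to script each SQL login with its password hash and SID so it can be recreated on the new server without resetting passwords; a minimal sketch, assuming SQL Server 2008 or later for the binary-to-string CONVERT style:

-- Generates create login statements that preserve password hash and SID.
-- Run on the source server and execute the output on the destination.
select 'create login ' + quotename(name)
     + ' with password = '
     + convert(varchar(max), cast(loginproperty(name, 'PasswordHash') as varbinary(256)), 1)
     + ' hashed, sid = '
     + convert(varchar(max), sid, 1) + ';'
from sys.sql_logins
where name not like '##%';   -- skip internal certificate-based logins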
bubbly (5/28/2009)
May 29, 2009 at 7:17 am
Here is a script that gives you all the information you are looking for.
set nocount on

if object_id('tempdb..##temp_schedule') is not null
begin
    drop table ##temp_schedule
    --print '##temp_schedule table dropped'
end

if object_id('tempdb..#temp1') is not null
begin
    drop...
May 20, 2009 at 6:47 pm
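The script above is cut off; judging from the ##temp_schedule table name, it appears to collect job schedule details. A much shorter sketch of that kind of information, pulled straight from the msdb system tables (an assumption about the original script's intent):

-- SQL Agent jobs with their attached schedules.
select j.name    as job_name,
       j.enabled as job_enabled,
       s.name    as schedule_name,
       s.freq_type,
       s.active_start_time
from msdb.dbo.sysjobs as j
join msdb.dbo.sysjobschedules as js
     on js.job_id = j.job_id
join msdb.dbo.sysschedules as s
     on s.schedule_id = js.schedule_id
order by j.name;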
Can't you filter and fetch the data from Oracle through OPENQUERY and then join the required data, or do you want to do the join on the entire data set without filtering?
May 14, 2009 at 2:52 pm
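A minimal sketch of that approach, assuming a hypothetical linked server ORA_LINK and placeholder table and column names: the filter runs on the Oracle side inside OPENQUERY, and only the reduced result set is joined locally.

-- Push the date filter to Oracle, then join the result to a local table.
select o.order_id, o.order_date, c.customer_name
from openquery(ORA_LINK,
     'select order_id, customer_id, order_date
        from orders
       where order_date >= date ''2009-01-01''') as o
join dbo.customers as c
     on c.customer_id = o.customer_id;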
SanjayAttray (5/14/2009)
I know how to extract data from Oracle using MSDAORA or OPENQUERY, but I have never made any joins between both data sets, and I really don't know how to do that.
There are...
May 14, 2009 at 2:16 pm
Because the content database size is 70 GB now and it will grow to 150 GB in a few months. So creating multiple data files will improve performance, right? That's what I want to accomplish... I...
May 14, 2009 at 1:58 pm
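For reference, adding an extra data file looks like the sketch below; the database name, logical file name, and path are placeholders. Whether multiple files actually help depends largely on whether they sit on separate physical volumes.

-- Add a second data file to the content database.
alter database WSS_Content
add file
(
    name = 'WSS_Content_Data2',
    filename = N'E:\Data\WSS_Content_Data2.ndf',
    size = 10GB,
    filegrowth = 1GB
);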