Is there any reason to have the service use a Domain account? Have the service use a local administrator account.
June 6, 2006 at 7:38 am
Is @VAR2 a field in TableA? Are you trying to return one result set? If TableA and TableB have the same structure, you could union the results together.
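For example, something along these lines returns both tables as a single result set (a rough sketch; the column names and the @VAR2 filter are only placeholders, since I am guessing at your layout):
Select Column1, Column2
From TableA
Where Column1 = @VAR2
Union All
Select Column1, Column2
From TableB
Where Column1 = @VAR2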
June 6, 2006 at 6:57 am
EXEC sp_changeobjectowner 'imported', 'dbo'
This will change the owner to dbo.
August 26, 2004 at 7:19 am
Create the DTS package and create a variable for the file name. You can use a stored procedure to run the DTS package, passing the name of the file in. A...
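As a rough sketch of that approach (the server, package, procedure, and variable names here are just placeholders), the stored procedure can shell out to DTSRun and pass the file name in as a global variable:
CREATE PROCEDURE dbo.usp_RunImportPackage
    @FileName varchar(255)
AS
BEGIN
    DECLARE @cmd varchar(1000)
    -- /N = package name, /E = trusted connection, /A = global variable (type 8 = string)
    SET @cmd = 'DTSRun /S MyServer /E /N "ImportPackage" /A "FileName":"8"="' + @FileName + '"'
    EXEC master..xp_cmdshell @cmd
END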
August 26, 2004 at 7:11 am
If you use a temp table, it has to be a global temp table.
The way we do this is: we built a stored procedure that creates the table, then we use...
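A rough sketch of that pattern (the procedure and table names are just examples): the procedure creates a global temp table with the ## prefix so the DTS package, which runs on its own connection, can still see it:
CREATE PROCEDURE dbo.usp_CreateStagingTable
AS
BEGIN
    -- ## makes the table global, so other connections (such as the DTS package) can use it
    IF OBJECT_ID('tempdb..##ImportStaging') IS NOT NULL
        DROP TABLE ##ImportStaging

    CREATE TABLE ##ImportStaging (
        RowID int IDENTITY(1, 1) NOT NULL,
        RawLine varchar(8000) NULL
    )
END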
August 26, 2004 at 7:01 am
You could create a table like this.
CREATE TABLE [dbo].[DTSPackageExecutionLog] (
[DTSPackageExecID] [numeric](18, 0) IDENTITY (1, 1) NOT NULL ,
[DTSPackageName] [varchar] (30) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
[Step] [int] NOT NULL ,
[Success]...
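Each package step can then write a row into the log. A rough sketch (I am assuming the truncated columns above include a Success flag, so adjust the column list to match your actual table):
INSERT INTO dbo.DTSPackageExecutionLog (DTSPackageName, Step, Success)
VALUES ('ImportPackage', 1, 1)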
February 25, 2004 at 9:19 am
Can you just delete the data and do an insert statement from the current table to the backup?
Truncate Table Backup
Insert Into Backup
Select * from Original
Truncate Table Original
bulk insert...
February 25, 2004 at 8:58 am
If you reboot and it gets faster, I would think the issue might be in your client application. It could be keeping resources open when it should be closing them. ...
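One quick way to see whether connections are piling up (just a rough check, not specific to your application) is to look at the session list on the server:
-- Lists current sessions; watch for a growing count from the client application's host or login
EXEC sp_who2

-- Or count connections per host and program (SQL 2000-era system table)
SELECT hostname, program_name, COUNT(*) AS ConnectionCount
FROM master..sysprocesses
GROUP BY hostname, program_name
ORDER BY ConnectionCount DESC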
February 25, 2004 at 8:47 am
I do not think that the DOS ftp client will let you do this.
The FTP task in DTS is pretty weak. If you are going to go through the trouble...
February 25, 2004 at 8:34 am
Defragmenting drives only has positive effects on SQL Server. The more fragmentation there is, the more work it takes to read the data.
The best setting is to let Diskeeper run continually for several hours...
February 24, 2004 at 9:43 am
We have a similar situation. In our case I found the best option is using the BULK INSERT command inside a stored procedure. It runs quicker and is...
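A rough sketch of what that looks like (the table name, file layout, and terminators are placeholders for whatever your feed actually uses):
CREATE PROCEDURE dbo.usp_LoadDataFile
    @FilePath varchar(255)
AS
BEGIN
    DECLARE @sql varchar(1000)
    -- BULK INSERT does not take a variable for the file name directly,
    -- so build and execute the statement dynamically
    SET @sql = 'BULK INSERT dbo.ImportTable FROM ''' + @FilePath + '''
                WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')'
    EXEC (@sql)
END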
February 24, 2004 at 8:21 am
You have to use some kind of temp table.
This should work; I tested it with a couple of records.
Select Min(Column1) as Column1, Column3, Count(Column3) as [Count]
Into #Temp_Keep
From Table1
Group By Column3
Having Count(Column3) > 1
Delete From Table1
Where...
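As a rough sketch of how the truncated DELETE usually finishes (assuming Column1 is unique across Table1), remove every duplicate row that is not the kept minimum, then clean up:
Delete From Table1
Where Column3 In (Select Column3 From #Temp_Keep)
  And Column1 Not In (Select Column1 From #Temp_Keep)

Drop Table #Temp_Keep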
February 19, 2004 at 8:48 am