Forum Replies Created

Viewing 15 posts - 16 through 30 (of 88 total)

  • RE: SQL Server Log data missing

The database is in Full Recovery. We have a full backup of the state prior to the data loss, and of the data entry too. But we do not have a log backup, as it...

  • RE: Sysobjects table columns

Thanks for the link. This question arose because we have an application that lists the tables in a database for further processing.

A table created by the sa login was not...

  • RE: Searching with like operator(billions of rows)

Yes. I did exactly that. It took 2.51 hrs to return 150K rows. I think it also depends on what I select in the output list. I think selecting only the id...

  • RE: Searching with like operator(billions of rows)

Thanks Abhijeet. Can you explain how that works internally?

  • RE: Searching with like operator(billions of rows)

Table1 contains name & address concatenated together:

    ID  Col1
    1   Name1,address1
    2   name2,address2...

There are a certain set of words (~300) stored in col2 of table2, and those words are...
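For reference, the matching described above is often written as a LIKE join between the two tables. This is only a sketch of that pattern, using the table and column names from the excerpt; how the original query was actually written is not shown in the post:

```sql
-- Sketch only: match every row of table1 against each of the ~300
-- search words stored in table2.col2.
-- Note: the leading wildcard ('%word%') prevents index seeks,
-- which is one reason scans over billions of rows are slow here.
SELECT t1.ID, t2.col2 AS MatchedWord
FROM   table1 AS t1
JOIN   table2 AS t2
       ON t1.Col1 LIKE '%' + t2.col2 + '%';
```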

  • RE: Best possible join

    Thanks for the reply. What should be the ideal index plan for these two tables?

  • RE: Concurrency

You could possibly manage it with the SERIALIZABLE isolation level and the READPAST hint, selecting 10 records from each thread and holding the transaction open while you process them.

    I tried...

  • RE: Concurrency

    SELECT TOP 10 Col1, Col2, Col3
    FROM myTable WITH (READPAST)
    WHERE Col1 = 'XYZ'
      AND Col2 = 'ABC'

    I believe this will get you ten records that are not locked for update.

    Can I lock the records / rows while...
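A common way to both skip locked rows and hold the claimed rows until processing finishes is to combine READPAST with UPDLOCK and ROWLOCK inside an open transaction. The hints added here are an assumption extending the example above, not something stated in the thread:

```sql
-- Sketch only: claim ten unlocked rows and hold row locks on them
-- until the transaction commits. UPDLOCK/ROWLOCK are assumptions
-- layered on top of the READPAST example above.
BEGIN TRANSACTION;

SELECT TOP 10 Col1, Col2, Col3
FROM   myTable WITH (UPDLOCK, READPAST, ROWLOCK)
WHERE  Col1 = 'XYZ'
  AND  Col2 = 'ABC';

-- ...process the ten rows here; other threads using READPAST
-- will skip these rows while this transaction remains open...

COMMIT TRANSACTION;
```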

  • RE: Concurrency

    Yes. You are right.

  • RE: Concurrency

    Hi Chris,

    Thanks for the reply.

The records will not actually be preassigned. They will be allocated sequentially and dynamically, i.e. any top 10 records should be assigned which are not...

  • RE: Updating millions of records

Does the free space on the hard disk matter (considering the Simple recovery model)?

    We have ~20GB free space on the drive where database files are stored.

  • RE: Updating millions of records

Thanks a lot. This information is really valuable. I will check it out and post here if I find anything interesting.

  • RE: Updating millions of records

    Thanks Jeff! We have changed the database recovery model to Simple for this database. I will check how much the log grows after the respective team runs their matching queries.

  • RE: Updating millions of records

    Hi Jeff,

We do shrink the log, but not every night. We receive the data in lots (45K to 400K rows) which need to be matched against the data in this table.

    Once...
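For context on why the Simple recovery model helps here: if the matching update is broken into batches, log space from each committed batch can be reused at the next checkpoint instead of growing the log file. This is only a sketch of that batching pattern, with hypothetical table and column names:

```sql
-- Sketch only (hypothetical names): update in batches so that, under
-- the Simple recovery model, the log can truncate between batches
-- rather than logging the whole update as one giant transaction.
DECLARE @rows INT = 1;

WHILE @rows > 0
BEGIN
    UPDATE TOP (50000) dbo.BigTable
    SET    Matched = 1
    WHERE  Matched = 0;

    SET @rows = @@ROWCOUNT;

    CHECKPOINT;  -- allow log space reuse between batches
END;
```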

  • RE: Updating millions of records

Basically, we do not have any application accessing this table, so we did not plan for normalization.

We drop the indexes and recreate them every month. From next month onward we...
