Articles

SQLServerCentral Article

Enforcing Referential Integrity in Microsoft SQL Server 2000

Referential integrity is a critical part of any well-designed RDBMS application, not just of Oracle, DB2, or some other platform. SQL Server has tools to make it easy, but a developer has to take the time to ensure that it is set up correctly, and ignorance is no excuse. New author Nick Duckstein brings us a look at basic RI and how you can set it up in your database.
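
For readers who want a quick reminder of what declarative RI looks like, here is a minimal sketch of a FOREIGN KEY constraint (the table and column names are hypothetical, not taken from the article):

CREATE TABLE dbo.Customers (
    CustomerID   int          NOT NULL PRIMARY KEY,
    CustomerName varchar(100) NOT NULL
)
GO
CREATE TABLE dbo.Orders (
    OrderID    int NOT NULL PRIMARY KEY,
    CustomerID int NOT NULL,
    -- The FOREIGN KEY constraint enforces referential integrity:
    -- an order cannot reference a customer that does not exist.
    CONSTRAINT FK_Orders_Customers FOREIGN KEY (CustomerID)
        REFERENCES dbo.Customers (CustomerID)
)
GO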

5 (2)

2005-01-17

14,204 reads

Technical Article

SQL Server 2005 - Managed execution

The next version of SQL Server, SQL Server 2005, is heavily hyped for its integration of the CLR into the database engine. The introduction of the CLR into SQL Server allows developers to write stored procedures, triggers, user-defined functions, user-defined aggregates, and user-defined types using .NET languages like VB.NET and C#. This opens up multiple avenues for developers, but we need to be careful about how we make the most of the feature.
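
To give a flavor of what the integration looks like from the database side, here is a minimal, hypothetical sketch of registering a .NET assembly and exposing one of its methods as a T-SQL function (the assembly, path, class, and method names are invented for illustration):

-- Register the compiled .NET assembly with the database.
CREATE ASSEMBLY StringUtilities
FROM 'C:\Assemblies\StringUtilities.dll'
WITH PERMISSION_SET = SAFE
GO
-- Expose a static method from that assembly as a scalar T-SQL function.
CREATE FUNCTION dbo.ProperCase (@input nvarchar(4000))
RETURNS nvarchar(4000)
AS EXTERNAL NAME StringUtilities.[StringUtilities.Formatting].ProperCase
GO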

2005-01-14

3,007 reads

SQLServerCentral Article

Merge Replication - Manual Range Handling

SQL Server 2000 replication is a great feature, but it can cause some headaches at times. Since the use of identities is something many people take advantage of, learning to handle these in a replication scenario is critical. Author Paul Ibison has done extensive work with replication and brings us two techniques to help manage the ranges of values.
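
The article's two techniques are not reproduced here, but as one general illustration of manual range handling, a server can be reseeded into its own identity range and fenced in with a NOT FOR REPLICATION check constraint (the table name and range boundaries below are hypothetical):

-- Give this server its own identity range, e.g. 1,000,000 to 1,999,999.
DBCC CHECKIDENT ('dbo.Orders', RESEED, 1000000)
GO
-- The constraint is enforced for local inserts but skipped for replicated ones.
ALTER TABLE dbo.Orders
    ADD CONSTRAINT CK_Orders_IdentityRange
    CHECK NOT FOR REPLICATION (OrderID BETWEEN 1000000 AND 1999999)
GO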

5 (2)

2005-01-13

12,757 reads

Technical Article

SQL Server 2005: Integrating SQL, XML, and XQuery

The evolution of SQL and the XML Query Language (XQuery) continues with the work of the World Wide Web Consortium (W3C) and the InterNational Committee for Information Technology Standards (INCITS). Providers of SQL database management systems have upgraded products such as Microsoft SQL Server to support the storage and retrieval of XML documents. Microsoft has provided stored procedures and Transact-SQL extensions for working with XML. On the horizon are even more changes as Microsoft introduces SQL Server 2005. (MP3 Audio)
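
As a small taste of what the SQL Server 2005 work enables, the new xml data type lets you run XQuery expressions directly from T-SQL; a minimal sketch (the document shape here is invented):

DECLARE @doc xml
SET @doc = '<orders><order id="1"><item>Widget</item></order></orders>'
-- The query() method evaluates an XQuery expression against the xml value.
SELECT @doc.query('/orders/order[@id=1]/item')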

2005-01-13

1,665 reads

Technical Article

Understanding "Yukon" Schema Separation

Well, it has finally arrived, at least in beta. Microsoft's long-awaited new version of its SQL Server product holds promise to be a major and successful revision of this fine product. I have had the beta version for a few months now, and one of the new security items that has intrigued me the most is the separation of users and schemas. I've worked with this form of separation before in Microsoft's chief competitor, but this article is not a comparison of the two products or the way they implement schema separation; it is an article on the basics of user/schema separation for those SQL Server DBAs who may not have worked with schema separation before.
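
For a concrete feel of what user/schema separation means in practice, here is a minimal sketch in Yukon-style T-SQL (the login, user, and schema names are hypothetical):

CREATE LOGIN AppLogin WITH PASSWORD = 'Str0ngP@ssw0rd!'
GO
CREATE USER AppUser FOR LOGIN AppLogin
GO
-- A schema is now a first-class container that can be owned by any principal.
CREATE SCHEMA Sales AUTHORIZATION AppUser
GO
-- The user's default schema no longer has to share the user's name.
ALTER USER AppUser WITH DEFAULT_SCHEMA = Sales
GO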

2005-01-12

2,354 reads

SQLServerCentral Article

Can You Compute?

Transact-SQL in SQL Server 2000 has some interesting features, many of which most DBAs will never use. While many DBAs are familiar with the basic aggregate functions, there are a few advanced ones that are not well understood. The ROLLUP and COMPUTE operators are two of these, and David Poole takes a look at how they work and at a practical application for them.
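
As a quick reminder of what the two operators do, here is a sketch against a hypothetical dbo.Sales table (not taken from the article):

-- WITH ROLLUP adds subtotal and grand-total rows to the grouped result set.
SELECT Region, SUM(Amount) AS TotalAmount
FROM dbo.Sales
GROUP BY Region WITH ROLLUP

-- COMPUTE ... BY appends separate summary result sets after the detail rows.
SELECT Region, Amount
FROM dbo.Sales
ORDER BY Region
COMPUTE SUM(Amount) BY Region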

2005-01-11

10,657 reads

Technical Article

Trace-scrubbing Tools

Andrew Zanevsky shares his trace-scrubbing procedures, which make it easy for you to handle large trace files and aggregate transactions by type, even when the captured T-SQL code has variations.

SQL Server Profiler is a veritable treasure trove when it comes to helping DBAs optimize their T-SQL code. But the surfeit of riches (I'm reminded of the Arabian Nights tale of Aladdin) can be overwhelming. I recently had one of those "sinking" feelings when I first tried to make sense of the enormous amount of data collected by traces on a client's servers. At this particular client, the online transaction processing system executes more than 4 million database transactions per hour. That means that even a 30-minute trace that captures "SQL Batch Completed" events results in a table with 2 million rows. Of course, it's simply impractical to process so many records without some automation, and even selecting the longest or most expensive transactions doesn't necessarily help in identifying bottlenecks. After all, short transactions can be the culprits of poor performance when executed thousands of times per minute.
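
The author's actual scrubbing procedures are in the article; as a rough illustration of the general idea, a captured trace file can be loaded with fn_trace_gettable and aggregated (the file path below is hypothetical, and grouping on a raw text prefix is only a crude stand-in for real scrubbing of literals and parameters):

-- Load the trace file and summarize completed batches by their leading text.
SELECT  CONVERT(nvarchar(100), TextData) AS BatchPrefix,
        COUNT(*)      AS Executions,
        SUM(Duration) AS TotalDuration,
        SUM(Reads)    AS TotalReads
FROM ::fn_trace_gettable('C:\Traces\MyTrace.trc', DEFAULT)
WHERE EventClass = 12   -- SQL:BatchCompleted
GROUP BY CONVERT(nvarchar(100), TextData)
ORDER BY TotalDuration DESC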

2005-01-11

2,003 reads

Blogs

2024 PASS Data Community Summit Prep

Next week is the 2024 PASS Data Community Summit in Seattle. I’ll be traveling...

A New Word: Bye-over

bye-over – n. the sheepish, casual vibe between two people who've shared an emotional...

Free webinar – Tackling the Gaps and Islands Problem with T-SQL Window Functions

I’m hosting a free webinar at MSSQLTips.com on the 19th of December 2024, 6PM...

Read the latest Blogs

Forums

PASS Summit Time

By Louis Davidson (@drsql)

Comments posted to this topic are about the item PASS Summit Time

database restore chain

By sqlfriend

I have full, differential, and transaction log backups set up for our database....

Temporary Table Problem

By fk.da

Hello everyone, I hope you can help me. I have a table with measurement...

Visit the forum

Question of the Day

Incremental Statistics

I have run this on SQL Server 2022 for the Sales database:

ALTER DATABASE Sales SET AUTO_CREATE_STATISTICS ON (INCREMENTAL = ON)

I then run this in the Sales database:

USE Sales
GO
CREATE STATISTICS CustomerStats1 ON dbo.Customer (CustomerKey, EmailAddress) WITH INCREMENTAL = OFF

The dbo.Customer table is partitioned. How are statistics created?

See possible answers