April 3, 2009 at 11:37 am
foxjazz (4/3/2009)
But who am I to say anything about the limitations of t-sql?
When it comes to data manipulation, I've not found any limitations yet.
--Jeff Moden
Change is inevitable... Change for the better is not.
April 3, 2009 at 11:45 am
Jeff Moden (4/3/2009)
foxjazz (4/3/2009)
But who am I to say anything about the limitations of t-sql? When it comes to data manipulation, I've not found any limitations yet.
Jeff, ummm... what do you mean by data manipulation? Are you saying T-SQL can compress / decompress data as easily as this?
byte[] decompressedBytes;
decompressedBytes = SevenZip.Compression.LZMA.SevenZipHelper.Decompress(vdata);
ms = new MemoryStream(decompressedBytes);
Please buddy, say it's so.
April 3, 2009 at 11:49 am
Chris Date's incoherency principle is in full force...
/*****************
If most people are not willing to see the difficulty, this is mainly because, consciously or unconsciously, they assume that it will be they who will settle these questions for the others, and because they are convinced of their own capacity to do this. -Friedrich August von Hayek
*****************/
April 3, 2009 at 11:51 am
foxjazz (4/3/2009)
Jeff Moden (4/3/2009)
foxjazz (4/3/2009)
But who am I to say anything about the limitations of t-sql? When it comes to data manipulation, I've not found any limitations yet.
Jeff, ummm... what do you mean by data manipulation? Are you saying T-SQL can compress / decompress data as easily as this?
byte[] decompressedBytes;
decompressedBytes = SevenZip.Compression.LZMA.SevenZipHelper.Decompress(vdata);
ms = new MemoryStream(decompressedBytes);
Please buddy, say it's so.
Nope... but I can call the compressor from T-SQL just like you can call it from anywhere else.
--Jeff Moden
Change is inevitable... Change for the better is not.
April 3, 2009 at 12:02 pm
Jeff Moden (4/3/2009)
foxjazz (4/3/2009)
Jeff Moden (4/3/2009)
foxjazz (4/3/2009)
But who am I to say anything about the limitations of t-sql? When it comes to data manipulation, I've not found any limitations yet.
Jeff, ummm... what do you mean by data manipulation? Are you saying T-SQL can compress / decompress data as easily as this?
byte[] decompressedBytes;
decompressedBytes = SevenZip.Compression.LZMA.SevenZipHelper.Decompress(vdata);
ms = new MemoryStream(decompressedBytes);
Please buddy, say it's so.
Nope... but I can call the compressor from T-SQL just like you can call it from anywhere else.
What I mean is that T-SQL doesn't do the compression any more than you did the compression from C#. Both, however, are capable of calling a program that does, so the answer is "Yes", just as easily as you did it from C#.
--Jeff Moden
Change is inevitable... Change for the better is not.
April 3, 2009 at 12:08 pm
I too shall unsubscribe from "Son Of Thread"... 42 pages and counting... I'm impressed.
Cheers folks.
Gaby
"In theory, theory and practice are the same. In practice, they are not." - Albert Einstein
April 3, 2009 at 12:51 pm
Can't believe I read the whole thing. Thanks a lot for the link, Steve! :-P
I can see where Fox is coming from, however I believe it is born from not wanting T-SQL to do ANY of the processing. I can certainly appreciate his position, considering all of those DAL classes I've written over the years for various object libraries. And if you used stored procedures, you had to!
Recently, I had a project that required a looping mechanism to update approximately 2 million rows. I decided against a cursor (truthfully, it would have been a cursor within a cursor), due to the strain it would cause on the production database.
Instead, I began experimenting with LINQ-to-SQL. The cool part was that the data layer is generated for you, which meant IntelliSense for columns (properties) from your tables (objects). That was sweet. Here's the part that sucked: LINQ-to-SQL doesn't allow for set-based updates. You can write a foreach loop, update your property for each item, then call SubmitChanges; however, .NET translates that into a discrete UPDATE statement for each row. See this post on the MSDN forum for more.
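The pattern that bit me looks roughly like this (the DataContext, table, and column names here are made up for illustration, not from my actual project):

```csharp
using (MyDataContext db = new MyDataContext())   // hypothetical LINQ-to-SQL context
{
    // LINQ-to-SQL materializes every matching row into memory...
    var pending = from o in db.Orders
                  where o.Status == "Pending"
                  select o;

    foreach (var o in pending)
        o.Status = "Processed";   // each change is tracked per object

    // ...then SubmitChanges emits one discrete UPDATE statement per
    // modified row, rather than a single set-based UPDATE.
    db.SubmitChanges();
}
```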
I read in a few posts on various sites about this issue, and it seems that this was a point of contention between the entity framework project and the LINQ-to-SQL people. Whatever the case, there were plans to do it, and then they didn't.
At the end of the day, I was able to finish the update project on time with the Linq-to-SQL solution. It took approximately 2 hours to run and did not impact my production users.
So, as an alternative to having to write T-SQL, Fox, I suggest looking at LINQ-to-SQL, even though it won't help your set-based problems much!
Mr Penny
April 3, 2009 at 1:05 pm
foxjazz (3/13/2009)
Why couldn't they write t-sql more like C#, or have a flavored compiler for t-sql so that we could work it like C# or VB depending on how you like it (or the t-sql language, which really is the worst language MS has ever put on the block)? I really don't understand why the sql server group didn't do a better job of it, or is it because they were taking their cue from ORACLE. Really stupid paradigm if you ask me, and I think it needs to be improved upon.
Just got pointed over to this thread. T-SQL is based largely upon the ANSI/ISO SQL standard. ORACLE and MS have both taken their "cue" from ANSI/ISO. So have IBM and MySQL. T-SQL includes vendor-specific extensions, mostly to deal with the lack of procedural extensions and other shortcomings in the original ANSI SQL standards. You absolutely have the ability to connect to SQL Server and use C# or VB to manipulate the data it contains. In fact, you can do this without writing a single line of T-SQL code yourself fairly easily. T-SQL can definitely be improved, and in fact it is in a constant state of improvement, but I don't believe it's really the "worst language MS has ever put on the block". I can think of at least two languages that start with the letter "J" that rate higher (or lower depending on how you look at it).
BTW, T-SQL and C# are different languages because they have different goals. The paradigm of C# is you tell the computer what to do and how to do it one step at a time. Think of telling a robot how to buy a bottle of soda: "Take twelve steps forward, turn the doorknob, pull the door open, step outside, turn right, walk three blocks to the store..."
In T-SQL we define the problem in terms of the end result. In other words the paradigm changes from step-by-step instructions to explaining what we want to achieve. In T-SQL terms you'd tell the robot to "go buy a bottle of soda", and the SQL Server engine will worry about the details of how to get you that end result.
Basically I think a comparison of T-SQL and C# on the basis of syntax, without a comparison of their differing objectives, is pretty pointless.
April 3, 2009 at 2:40 pm
foxjazz (3/13/2009)
also instead of using cursors like they are now, wouldn't it be easier to say:
foreach( select myfield from tbl where myotherfield > 3)
{
insert into myothertable (myotherfield) values (myfield)
}
I think it would be easier if this were the case. No GSquared, I like t-sql, but I think it could easily be made a lot more powerful if the t-sql guys would just get off their asses and do something significant.
OMG, who uses cursors with T-SQL? SQL Server is row-based already.
INSERT INTO myothertable (myotherfield) SELECT myfield FROM tbl WHERE myotherfield > 3
Use cursors only after you have made sure that there is no other way.
In the end, it shows us that you may, of course, use a shoe to drive a nail into a wall, but one shouldn't blame the hammer just because you don't know how to use it.
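For contrast, here is what the cursor version of that one-line INSERT would look like (assuming myfield is an integer column, purely for illustration):

```sql
-- Cursor equivalent of the single INSERT...SELECT above - for contrast only
DECLARE @myfield INT;  -- assumed type

DECLARE c CURSOR LOCAL FAST_FORWARD FOR
    SELECT myfield FROM tbl WHERE myotherfield > 3;

OPEN c;
FETCH NEXT FROM c INTO @myfield;

WHILE @@FETCH_STATUS = 0
BEGIN
    INSERT INTO myothertable (myotherfield) VALUES (@myfield);
    FETCH NEXT FROM c INTO @myfield;
END;

CLOSE c;
DEALLOCATE c;
```

Fifteen lines, one row at a time, versus one set-based statement.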
April 3, 2009 at 2:43 pm
foxjazz (4/3/2009)
Jeff Moden (4/3/2009)
foxjazz (4/3/2009)
But who am I to say anything about the limitations of t-sql? When it comes to data manipulation, I've not found any limitations yet.
Jeff, ummm... what do you mean by data manipulation? Are you saying T-SQL can compress / decompress data as easily as this?
byte[] decompressedBytes;
decompressedBytes = SevenZip.Compression.LZMA.SevenZipHelper.Decompress(vdata);
ms = new MemoryStream(decompressedBytes);
Please buddy, say it's so.
That's not data manipulation, Foxjazz. That's re-encoding, a specialized form of data transformation.
And even if it were, "less easy" != "limitation"; again, different things.
-- RBarryYoung, (302)375-0451
blog: MovingSQL.com, Twitter: @RBarryYoung
Proactive Performance Solutions, Inc. "Performance is our middle name."
April 3, 2009 at 3:16 pm
Nevermind... this was answered by those who do things better than I 🙂
April 3, 2009 at 7:44 pm
Merry Krissmess (4/3/2009)
OMG, who uses cursors with T-SQL? SQL Server is row-based already.
You might be surprised how many people manually override the SQL Server optimizer through judicious use of cursors. A lot of people starting out in SQL got catapulted into it from the comfort zone of a procedural language they're familiar with (C#, VB, C++, etc.). Many have a hard time wrapping their head around SQL's set-based paradigm and declarative syntax.
And we haven't even begun the discussion of three-valued logic yet 🙂
I'm more of a moderate on cursors than some of our peers out here. There are places where cursors will perform the task better, but they're few and far between. In 15 or so years of doing this I've only used cursors a handful of times, basically for administrative tasks where the data sets were small and the order of processing was far more critical than performance.
April 3, 2009 at 7:53 pm
foxjazz (4/3/2009)
Jeff Moden (4/3/2009)
foxjazz (4/3/2009)
But who am I to say anything about the limitations of t-sql? When it comes to data manipulation, I've not found any limitations yet.
Jeff, ummm... what do you mean by data manipulation? Are you saying T-SQL can compress / decompress data as easily as this?
byte[] decompressedBytes;
decompressedBytes = SevenZip.Compression.LZMA.SevenZipHelper.Decompress(vdata);
ms = new MemoryStream(decompressedBytes);
Please buddy, say it's so.
It's so. You can always slap a SQLCLR function in there that accesses the .NET Framework's compression functionality in a T-SQL statement. I've done it myself to help minimize network traffic in some situations. It looks a little more like this once the function is installed:
declare @decompressedBytes varbinary(max);
set @decompressedBytes = dbo.DecompressData(@compressedBytes);
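Under the covers, a SQLCLR function like that is only a few lines of C#. This is an illustrative sketch, not my actual code: the function name DecompressData is assumed from the T-SQL call above, and GZipStream stands in for the 7-Zip LZMA helper from the earlier snippet.

```csharp
using System.Data.SqlTypes;
using System.IO;
using System.IO.Compression;
using Microsoft.SqlServer.Server;

public partial class UserDefinedFunctions
{
    // Sketch of a SQLCLR scalar UDF exposing Framework decompression to T-SQL.
    [SqlFunction(IsDeterministic = true)]
    public static SqlBytes DecompressData(SqlBytes compressed)
    {
        if (compressed.IsNull)
            return SqlBytes.Null;

        using (MemoryStream input = new MemoryStream(compressed.Value))
        using (GZipStream gzip = new GZipStream(input, CompressionMode.Decompress))
        using (MemoryStream output = new MemoryStream())
        {
            // Copy the decompressed stream out in chunks.
            byte[] buffer = new byte[4096];
            int read;
            while ((read = gzip.Read(buffer, 0, buffer.Length)) > 0)
                output.Write(buffer, 0, read);

            return new SqlBytes(output.ToArray());
        }
    }
}
```

Deploy it with CREATE ASSEMBLY / CREATE FUNCTION and you can call it inline in any query.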
Also, SQL 2008 includes built-in functionality to compress data in physical storage without the need for you to do anything but indicate "hey, I want you to compress the data in this table..." Much easier than calling a function every time you want to compress data in storage.
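For the built-in route, turning on compression for an existing table is basically a one-liner (the table name here is hypothetical):

```sql
-- SQL Server 2008 built-in storage compression
ALTER TABLE dbo.MyLargeTable
REBUILD WITH (DATA_COMPRESSION = PAGE);  -- or ROW for lighter-weight compression
```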
-------------------
What's more interesting to me is trying to perform a left join against two tables in C#, VB, or C++, with a projection and a restriction, as easily as a single statement like this:
SELECT *
FROM Customer c
LEFT JOIN CustomerAddress ca
ON c.AddressID = ca.AddressID
WHERE c.CustomerName LIKE 'SMI%';
April 3, 2009 at 8:59 pm
DCPeterson (4/3/2009)
Chris Date's incoherency principle is in full force...
Now that's funny! 🙂 The SQL community has ignored someone who's spent a lifetime trying to explain why there's nothing relational about SQL. If only the SQL community could hear itself think. That giant sucking sound you hear is the incoherent nature of SQL. Maybe all those naive developers aren't so stupid after all 🙂
April 3, 2009 at 9:24 pm
This is Chris Date's latest book; it is only 230 pages, and 50 of those were used to insult ANSI SQL, so you can understand why some readers call it grumpy theory. Chris's main problem is the algebraic flaw of the SELECT statement, but that is also the reason the object users' beloved XML could be accepted into ANSI SQL without the relevant math.
The most important thing in software is implementation, and at that, the RDBMS has succeeded beyond most things in software.
http://www.amazon.com/Database-Depth-Relational-Theory-Practitioners/dp/0596100124
Kind regards,
Gift Peddie