May 31, 2006 at 10:25 am
Comments posted to this topic are about the content posted at http://www.sqlservercentral.com/columnists/rVasant/xmlgoodandbad.asp
June 5, 2006 at 4:58 am
It's very helpful and meaningful.
June 5, 2006 at 6:15 am
Always knew it. XML is one of the worst things ever created.
June 5, 2006 at 6:57 am
Everything in moderation... I personally do not make much use of XML at all. I think the use of XML in standards is beneficial due to the wide array of software tools built to adhere to those standards, e.g. web services. It certainly saves time compared with hand-coding HTTP POSTs and the like.
But using it internally when some arrays, etc. would do doesn't make a lot of sense. There's a lot of overhead with string processing and the like, so I'm not surprised that things slowed down significantly: no XML could compare to a simple array of structs and some pointers... Hope you don't get too flamed by people for saying XML is not the saviour of us programmers/developers/db folk.
Cheers
June 5, 2006 at 7:31 am
The article started out nicely, but I feel it dropped off at the end. We are told where the problem was, but there was no mention, even in a short sentence, of how the issue was resolved. It just said there was a lot of rework. I would suggest adding a section called "The Solution" prior to "Conclusion" to give the article more punch.
Well written overall; it's just missing the punchline.
June 5, 2006 at 7:33 am
This article is terrible! You're not really saying anything!
I think the real lesson here is that you made the decision to use XML extensively (a technology with which you seem quite unfamiliar) without investigating it and testing your ideas in a sandbox environment. XML is great at what it does, but why would you use XML in this application when the database already provides your storage mechanism and you did not need to share data between heterogeneous systems? Frankly, this use of XML adds another layer of complexity with no benefit.
You would be much better off using a basic "domain collection/object model" architecture where domain objects are populated with data from your database. Recursion would have been easier and much faster...
June 5, 2006 at 7:41 am
Can we get an editor in here? Bad grammar and "OVER-USE!!!" of commas are distracting and unprofessional.
This article doesn't give a solution, or a real description of the problem. I could easily see this being poor programming, not an inherent problem with XML.
June 5, 2006 at 7:47 am
I guess this isn't really related to XML or your article; it's more of a solution. Use DevExpress' TreeList component, or create your directory structure from a single table instead of an XML file/document.
Table Structure:
ID | ParentID | Description      | Hint
 1 | NULL     | C:\              | Root Drive
 2 | 1        | Program Files    |
 3 | 1        | Windows          | Root Windows Folder
 4 | 2        | Microsoft Office |
In this way, you have an unlimited number of folders/subfolders by using a combination of ID and ParentID. We use this structure a lot and it's very fast and efficient.
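For anyone who wants to try it, here is a minimal T-SQL sketch of the pattern. The Folders table name is made up to match the columns above, and the recursive CTE assumes SQL Server 2005 or later:

CREATE TABLE Folders (
    ID          int          NOT NULL PRIMARY KEY,
    ParentID    int          NULL REFERENCES Folders (ID),  -- NULL marks a root node
    Description varchar(255) NOT NULL,
    Hint        varchar(255) NULL
);

-- Walk the whole tree from the roots, tracking depth as we recurse
WITH FolderTree (ID, ParentID, Description, Depth) AS
(
    SELECT ID, ParentID, Description, 0
    FROM   Folders
    WHERE  ParentID IS NULL
    UNION ALL
    SELECT f.ID, f.ParentID, f.Description, t.Depth + 1
    FROM   Folders f
    JOIN   FolderTree t ON f.ParentID = t.ID
)
SELECT ID, ParentID, Description, Depth
FROM   FolderTree;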
June 5, 2006 at 9:55 am
You are right... RECURSION was used to solve the problem.
Still, I feel that updating thousands of rows in a single go can easily be done with XML and the database's XML support. And it is really helpful when the details are scattered across different servers.
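For example, here is a rough sketch of that kind of set-based update, assuming SQL Server 2005's xml type and its nodes() method; the Folders table and the XML shape are hypothetical:

DECLARE @x xml;
SET @x = N'<folders>
  <folder id="2" hint="Installed applications" />
  <folder id="3" hint="Operating system files" />
</folders>';

-- Shred the XML into a rowset and update all matching rows in one statement
UPDATE f
SET    f.Hint = x.n.value('@hint', 'varchar(255)')
FROM   Folders f
JOIN   @x.nodes('/folders/folder') AS x(n)
  ON   f.ID = x.n.value('@id', 'int');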
thnx.
June 5, 2006 at 12:24 pm
Use of XML can facilitate readability if the schema and tags are appropriately designed. However, I'm far from convinced of its superiority in all areas.
The main issue for me is that as a data exchange format, it tends to be quite verbose (in terms of the ratio of data bytes to total bytes). Because of this, when used in interprocess communications to exchange data, performance can become an issue. And when used to create files for other applications, the files can be very large in proportion to the data they contain. I'm an old database dog and still like fixed-byte or CSV format for stuff like that, because it's more parsimonious.
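To put a rough number on that, compare the same made-up record in both formats:

XML: <row><id>42</id><name>Smith</name><amount>100.50</amount></row>   (63 bytes)
CSV: 42,Smith,100.50                                                   (15 bytes)

Same payload, but the XML version is roughly four times the size before any declaration, schema, or whitespace is added.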
June 5, 2006 at 11:33 pm
The details displayed in your solution are not very intuitive for the user. It is better to provide a TREE-like structure for your folders, which makes the user's task easier.
thnx
June 6, 2006 at 7:24 am
I was not overwhelmed by the explanation of the problem, the solution, or the conclusion. I am even unclear whether the real problem is the algorithm used for creating and then parsing the XML or the actual creation of the XML file. Having worked with XML, this approach should work fine for exchanges of data and data definition structures. However, partway through the article, the developer hints that "speed" is the problem. The totality of this article appears to be a sidebar note to another developer so that they can commiserate on the reality that there is no perfect development tool.
I would have found this article of some value had there been a real description of the problem (number of records, goals of the project, number of directories, number of databases, reason for the project in the first place, etc., etc.). It would have been even nicer had there been even a brief discussion of the algorithmic approach. It mentions recursion... and having done recursive functions in AI design for years, I have seen even experienced programmers create some of the most inefficient recursion code possible. And finally, the article provides no solid solution approach showing 1) what went wrong, 2) how the "wrong" was identified, and 3) how the solution was so much better. Did the author just use default XML builds from SQL? If so, why not complain that VS2005 does not do a better job of writing the code for you? Maybe because more experienced developers understand that 99% of the time, it's a person problem and not a tech tool problem.
Other than these basic criticisms, I would say that SQL Server Central editors were hard-pressed to find something of value to link to, other than a brief commentary from a programmer who had a bad week and then provided us a fragmented review of his experience.
'nuff said
thanks for the effort...
June 13, 2006 at 7:44 am
I cannot see how even 1,000 sibling directories should create a bottleneck. To me, it looks as though the author made another typical design mistake, namely transmitting all the data at once, i.e. all directories and subdirectories with their files. Only the first level should be queried and displayed initially; if the user expands a node, the application should query again for the next level, and so on.
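With the single-table layout suggested earlier in this thread, that on-demand query could be as simple as the following sketch (@ParentID is a hypothetical parameter holding the ID of the node the user just expanded):

-- Fetch only the immediate children of the expanded node;
-- for the root level, use WHERE ParentID IS NULL instead
SELECT ID, Description, Hint
FROM   Folders
WHERE  ParentID = @ParentID;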
Anyway, what's the use of storing the contents of an ever-changing file system in a database?
June 5, 2007 at 6:31 am
Sorry, but one of the first sentences:
"Because of the small size of files, the data transfer speed has also increased considerably, especially for web applications"
blew it for me. There's no way anyone who knows much about the subject can possibly consider XML files as having a "small size" compared to most of the alternatives out there.
XML is somewhat self-documenting and usually human-readable, but size and efficiency are not its strong points.
June 21, 2007 at 2:52 am
Sorry. This article is devoid of any merit or insight.
If I can summarize it, the salient points are:
XML is useful, but don't overuse it.
Did I miss anything?