
XML - Good and Bad
Posted Monday, June 5, 2006 11:33 PM
SSC-Enthusiastic


Group: General Forum Members
Last Login: Wednesday, May 8, 2013 7:23 AM
Points: 199, Visits: 136

The details displayed in your solution are not very intuitive for the user. It would be better to provide a tree-like structure for the folders, which would make the user's task easier.
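Something along these lines is what I mean; a rough Python sketch with made-up paths (the real folder names from the article would go here):

def build_tree(paths):
    """Fold flat directory paths into a nested dict, one level per path segment."""
    tree = {}
    for path in paths:
        node = tree
        for part in path.strip("/").split("/"):
            node = node.setdefault(part, {})   # descend, creating levels as needed
    return tree

def print_tree(node, indent=0):
    """Render the nested dict as an indented tree for the user."""
    for name, children in sorted(node.items()):
        print("  " * indent + name)
        print_tree(children, indent + 1)

if __name__ == "__main__":
    sample = ["projects/sql/scripts", "projects/sql/backups", "projects/xml"]
    print_tree(build_tree(sample))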

Thanks.

 

Post #285081
Posted Tuesday, June 6, 2006 7:24 AM
Grasshopper


Group: General Forum Members
Last Login: Tuesday, November 25, 2014 6:16 PM
Points: 16, Visits: 40

I was not overwhelmed by the explanation of the problem, the solution, or the conclusion.  I am even unclear whether the real problem is the algorithm used for creating and then parsing the XML, or the actual creation of the XML file.  Having worked with XML, I would say this approach should work fine for exchanges of data and data-definition structures.  However, partway through the article, the developer hints that "speed" is the problem.  The totality of this article reads like a sidebar note to another developer so that they can commiserate on the reality that there is no perfect development tool.

I would have found this article of some value had there been a real description of the problem (number of records, goals of the project, number of directories, number of databases, reason for the project in the first place, etc.).  It would have been even nicer had there been at least a brief discussion of the algorithmic approach.  It mentions recursion...and having written recursive functions in AI design for years, I have seen even experienced programmers create some of the most inefficient recursion code possible.  Finally, the article offers no solid solution showing 1) what went wrong, 2) how the "wrong" was identified, and 3) how the solution was so much better.  Did the author just use the default XML builds from SQL?  If so, why not complain that VS2005 does not do a better job of writing the code for you?  Maybe because more experienced developers understand that 99% of the time it's a person problem, not a tool problem.
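To illustrate the sort of inefficient recursion I mean (a generic example of my own, not the author's code), here are two recursive serializers over the same directory tree; the first copies the whole accumulated string on every call, the second appends parts to a list and joins once:

def to_xml_slow(name, children, depth=0):
    """Recursive serialization that rebuilds an ever-growing string on every call:
    each '+' copies the whole accumulated document, so the cost grows quadratically."""
    xml = "  " * depth + f"<dir name='{name}'>\n"
    for child, grandchildren in children.items():
        xml = xml + to_xml_slow(child, grandchildren, depth + 1)
    return xml + "  " * depth + "</dir>\n"

def to_xml_fast(name, children, depth=0, out=None):
    """Same traversal, but the parts go into one shared list and are joined once,
    which keeps the work linear in the size of the output."""
    if out is None:
        out = []
    out.append("  " * depth + f"<dir name='{name}'>")
    for child, grandchildren in children.items():
        to_xml_fast(child, grandchildren, depth + 1, out)
    out.append("  " * depth + "</dir>")
    return "\n".join(out) if depth == 0 else ""

if __name__ == "__main__":
    tree = {"sql": {"scripts": {}, "backups": {}}, "xml": {}}
    print(to_xml_fast("root", tree))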

Other than these basic criticisms, I would say the SQL Server Central editors were hard-pressed to find something of value to link to, beyond a brief commentary from a programmer who had a bad week and then gave us a fragmented review of his experience.

'nuff said

thanks for the effort...




Post #285213
Posted Tuesday, June 13, 2006 7:44 AM
SSC Rookie


Group: General Forum Members
Last Login: Friday, September 7, 2012 12:30 PM
Points: 37, Visits: 136

I cannot see how even 1,000 sibling directories should create a bottleneck. To me it looks like the author made another typical design mistake, namely transmitting all of the data at once, i.e. every directory and subdirectory along with its files. Only the first level should be queried and displayed initially; when the user expands a node, the application should query again for the next level, and so on.
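As a rough sketch of what I mean (using SQLite and a made-up directories(id, parent_id, name) table, purely for illustration):

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE directories (id INTEGER PRIMARY KEY, parent_id INTEGER, name TEXT)")
conn.executemany(
    "INSERT INTO directories (id, parent_id, name) VALUES (?, ?, ?)",
    [(1, None, "root"), (2, 1, "projects"), (3, 2, "sql"), (4, 2, "xml")],
)

def fetch_children(parent_id):
    """Return only the immediate children of one node; called again each time the
    user expands a node, instead of shipping the whole tree up front."""
    return conn.execute(
        "SELECT id, name FROM directories WHERE parent_id IS ?", (parent_id,)
    ).fetchall()

print(fetch_children(None))  # top level only
print(fetch_children(2))     # children of the expanded node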

Anyway, what's the use of storing the contents of an ever-changing file system in a database?

Post #287021
Posted Tuesday, June 5, 2007 6:31 AM
Forum Newbie


Group: General Forum Members
Last Login: Wednesday, June 6, 2007 1:55 PM
Points: 4, Visits: 1
Sorry, but one of the first sentences:

"Because of the small size of files, the data transfer speed has also increased considerably, especially for web applications"

blew it for me. There is no way anyone who knows much about the subject could consider XML files "small" compared to most of the alternatives out there.
XML is somewhat self-documenting and usually human-readable, but size and efficiency are not its strong points.
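A quick back-of-the-envelope way to see it (made-up records, standard library only): serialize the same rows as XML and as CSV and compare sizes; the per-row repetition of element names is where the XML overhead comes from.

import csv, io
import xml.etree.ElementTree as ET

rows = [{"id": str(i), "name": f"file{i}.txt", "size": str(i * 100)} for i in range(1000)]

# XML: every field name is repeated as an element for every record.
root = ET.Element("files")
for r in rows:
    item = ET.SubElement(root, "file")
    for k, v in r.items():
        ET.SubElement(item, k).text = v
xml_bytes = ET.tostring(root)

# CSV: field names appear once, in the header row.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "name", "size"])
writer.writeheader()
writer.writerows(rows)
csv_bytes = buf.getvalue().encode()

print(len(xml_bytes), len(csv_bytes))  # the XML output is typically several times larger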



Post #371289
Posted Thursday, June 21, 2007 2:52 AM
Grasshopper


Group: General Forum Members
Last Login: Tuesday, March 8, 2011 6:58 AM
Points: 11, Visits: 14

Sorry. This article is devoid of any merit or insight.

If I can summarize it, the salient points are:

XML is useful, but don't overuse it.

Did I miss anything?

Post #375508