The Full Potential of SQL 2000

  • BWHAAA-HAAAA! What life cycle? People can't even spell it anymore.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Certainly in the UK, we are beset by a number of failures of high-profile government IT initiatives. I feel sure that the average run-of-the-mill IT application within any organization is taking longer to complete. Even fifteen years ago, had I come up with an IT application to support a business that was going to take a year to implement, I'd have been laughed at.

    This is always going to be subjective, but there certainly seems to be a perception that applications just don't seem to be developed any faster for all the 'Rapid Application Development' going on. Jonathan Sachs developed the whole of Lotus 123 V1 in assembler code in just six months in 1982, and the whole project took only a year.

    Best wishes,
    Phil Factor

  • Phil Factor (12/15/2008)


    This is always going to be subjective, but there certainly seems to be a perception that applications just don't seem to be developed any faster for all the 'Rapid Application Development' going on. Jonathan Sachs developed the whole of Lotus 123 V1 in assembler code in just six months in 1982, and the whole project took only a year.

    I've often felt that application development was faster when the tools were simpler.

    In addition, there was much tighter control over the result. For example, a "hello world" can still be written using the DOS debug command:

    C:\> debug
    -n helloworld.com
    -a 100
    1552:0100 mov dx,010b
    1552:0103 mov ah,9
    1552:0105 int 21
    1552:0107 mov ah,4c
    1552:0109 int 21
    1552:010B db 'Hello, World!',0d,0a,'$'
    1552:011B
    -r cx
    CX 0000
    :1b
    -w
    Writing 0001B bytes
    -q

    The result is all of 27 bytes long.

    For a Visual Basic 2005 console application...

    Module HelloWorld
        Sub Main()
            Console.WriteLine("Hello, World!")
        End Sub
    End Module

    The compiled executable is 24.5 Mbyte!

    Derek

  • [font="Courier New"]// You can still do it in under 2K
    // Maybe we'll have to start a software movement for old
    // codgers like me who like writing compact software in assembler code
    .assembly extern mscorlib {} //Common Object Runtime Library
    .assembly HelloWorld
    {
      .ver 1:0:0:1
    } //we can add a lot more information in this block
    .module HelloWorld.exe //the module name of our assembly
    .method static void main() cil managed
    {
      .maxstack 1 //max no. of items on the parameter stack
      .entrypoint
      ldstr "Hello world!"
      call void [mscorlib]System.Console::WriteLine(string)
      ret
    }[/font]

    Best wishes,
    Phil Factor

  • Heh...

    ECHO "Hello World"

    SELECT 'Hello World'

    PRINT 'Hello World'

    --Jeff Moden



  • I see lots of web-type apps being built quickly. Backpack (37 Signals) was built fairly quickly. I have seen corporate apps built in much less than a year, whereas early in my career everything seemed to take a year or more.

    I'm not sure we build from scratch anymore, which shortens development time, but without that experience of building from scratch, I think we end up training programmers who aren't as capable. My vote is that everyone needs to learn C for a semester, then C++, and work through pointers, sorts, and many of the fairly low-level programming problems to understand how Java, .NET, and other high-level languages work.
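    The kind of low-level exercise described above might look like this: a minimal insertion sort written directly against pointers, with no library sort routine. The array contents and names are my own illustration, not anything from the post.

    ```c
    /* Sketch of a classic "learn C" exercise: insertion sort over a
     * half-open pointer range [begin, end), shifting elements by hand. */
    #include <stdio.h>

    static void insertion_sort(int *begin, int *end)
    {
        for (int *p = begin + 1; p < end; p++) {
            int key = *p;                 /* value being inserted */
            int *q = p;
            while (q > begin && *(q - 1) > key) {
                *q = *(q - 1);            /* shift larger values right */
                q--;
            }
            *q = key;
        }
    }

    int main(void)
    {
        int a[] = { 5, 2, 9, 1, 7 };
        insertion_sort(a, a + 5);
        for (int i = 0; i < 5; i++)
            printf("%d ", a[i]);          /* prints: 1 2 5 7 9 */
        printf("\n");
        return 0;
    }
    ```

    Working through why the inner loop must compare before it shifts, and why the range is half-open, is exactly the sort of detail the high-level tools now hide.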

    Lotus and some of those older apps aren't good templates because I think top-notch programmers are just way, way more efficient. And there are few of them out there, past or present.

  • Derek Dongray (12/15/2008)


    I've often felt that application development was faster when the tools were simpler.

    In addition, there was much tighter control over the result. For example, a "hello world" can still be written using the DOS debug command:

    C:\> debug
    -n helloworld.com
    -a 100
    1552:0100 mov dx,010b
    1552:0103 mov ah,9
    1552:0105 int 21
    1552:0107 mov ah,4c
    1552:0109 int 21
    1552:010B db 'Hello, World!',0d,0a,'$'
    1552:011B
    -r cx
    CX 0000
    :1b
    -w
    Writing 0001B bytes
    -q

    The result is all of 27 bytes long.

    For a Visual Basic 2005 console application...

    Module HelloWorld
        Sub Main()
            Console.WriteLine("Hello, World!")
        End Sub
    End Module

    The compiled executable is 24.5 Mbyte!

    This sounded like FUD so I thought I'd try it out. C# console app, 5k executable. And with code that's orders of magnitude easier to maintain. There's nothing even remotely simpler or faster about what you wrote. I built the example in under a minute and had output of a real world, distributable application. All you have is DOS script. What's the point you're trying to make here? 27 bytes of source that's good for nothing?

    Most people posting here sound completely detached from reality and are completely ignoring the benefits modern tools give the developer. ASP.NET lets you build a scalable, thread-safe web app in a day. You can go back to C and write procedurally, or to C++ and have to manually manage memory again, but there's a reason people don't, and why you'll never get a business case off the ground to do so. Hey, chuck out web services and go back to writing proprietary APIs coded directly to the TCP/IP stack while you're at it. That'll make your app faster to write and reusable for partners, I'm sure.

    The reason apps take longer to write nowadays is they're bigger, talk to more systems, manage more data, and have more bureaucracy as a result. Crappier tools won't change any of that. If you don't want to design an enterprise data model to aggregate your systems you don't have to. If you don't want to go to the effort of building using SOA principles you can still do one off "enhancements" on two systems to move data with flat files. But you get what you pay for and there's a reason these are considered bad practices in the real world.

  • sausage.dog (12/15/2008)


    Derek Dongray (12/15/2008)


    This sounded like FUD so I thought I'd try it out. C# console app, 5k executable. And with code that's orders of magnitude easier to maintain. There's nothing even remotely simpler or faster about what you wrote. I built the example in under a minute and had output of a real world, distributable application. All you have is DOS script. What's the point you're trying to make here? 27 bytes of source that's good for nothing?

    Most people posting here sound completely detached from reality and are completely ignoring the benefits modern tools give the developer. ASP.NET lets you build a scalable, thread-safe web app in a day. You can go back to C and write procedurally, or to C++ and have to manually manage memory again, but there's a reason people don't, and why you'll never get a business case off the ground to do so. Hey, chuck out web services and go back to writing proprietary APIs coded directly to the TCP/IP stack while you're at it. That'll make your app faster to write and reusable for partners, I'm sure.

    The reason apps take longer to write nowadays is they're bigger, talk to more systems, manage more data, and have more bureaucracy as a result. Crappier tools won't change any of that. If you don't want to design an enterprise data model to aggregate your systems you don't have to. If you don't want to go to the effort of building using SOA principles you can still do one off "enhancements" on two systems to move data with flat files. But you get what you pay for and there's a reason these are considered bad practices in the real world.

    OK. I've been developing software for about 30 years, so I have this vague idea that I understand it. The list of languages I've worked with includes assembly code for many platforms (IBM 360, DEC PDP-7, -8, -10, -11 & -15, Intel x86 series...) as well as assorted high-level compiled and scripted languages such as PL/1, Fortran, Basic (many flavours), Pascal, C, C++, VB, Perl, PHP, SQL, etc. Applications I've worked on range from simple single-use apps to large applications used by thousands of users. I've also worked on compilers, which is why I am certain that applications have become slower.

    My example was written using the DOS debug command because I no longer have an assembler installed. The point was to show that current compilers are terribly inefficient and that the user interface actually prevents the developer from writing efficient code. I would say it's impossible to tell whether a 27-byte program is faster than a 5K one, because the execution time of either would be milliseconds and most of that would be overhead external to the actual code.

    I agree that writing a "Hello, World!" program (or any program) in assembler these days is probably overkill (as well as the fact that the program is pointless). But my point is that the tools do not give enough control to reach the optimal solution. I know that to do the job I want done (output "Hello, World!") the minimum needed is 27 bytes, but, as you pointed out yourself, there's no way to get close because the tools insist on including checks which may or may not be needed. "Hello, World!" only uses constants, so there is no need for memory management, but you can bet that all the check routines for stack overflow and so forth are getting dragged in even if there is no chance of them being triggered.

    I also agree that, for maintainability and speed of development, it's much better to work in as abstract a language as possible and let the tools take care of the details. But the problem is that the overheads resulting from this abstraction invariably result in slower code and, very often, you can't provide the necessary hints to reduce the overhead or, if you can, people don't bother.

    In SQL terms, this means that although SQL is supposed to be declarative (tell the server what you need, not how to get it), since the tool (in this case the optimiser) is just another application with limitations, the developer very often ought to be guiding it towards the best result.

    But the trouble is that a lot of modern developers tend not to do this (they just argue that you need a faster server) so applications run slower than they need to.
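    Guiding the optimiser, as described above, can be as simple as a query hint. The table and index names below are hypothetical, purely for illustration:

    ```sql
    -- Hypothetical table and index names, for illustration only.
    -- If the optimiser insists on scanning when you know the data
    -- distribution better than it does, SQL Server lets you force
    -- the index yourself rather than just "upgrading the hardware":
    SELECT o.OrderID, o.OrderDate
    FROM dbo.Orders o WITH (INDEX(IX_Orders_OrderDate))
    WHERE o.OrderDate >= '20001201';
    ```

    Hints like this cut both ways, of course: they keep overriding the optimiser even after the data changes, so they belong at the end of tuning, after indexes and statistics have been checked.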

    Apologies for the length of this, but it is something I feel strongly about. I believe developers should not just write code that works and let the tool optimise it, but should think about all the steps (even in SQL) and see if they can improve on the speed of the result. As I mentioned above, far too many people stop at "it works".

    Derek

  • Heh... Well said, Derek... I forget which year Bill Gates said it would happen, but he once said something to the effect that sometime in the very near future, there will be no true programmers... just users.

    --Jeff Moden



  • Derek,

    Interesting and a nice argument.

    However I'd argue that we need lots of checks in there. It can be the tools adding them, but we are more interconnected, there are more people looking for issues and vulnerabilities in applications in ways that didn't occur when we wrote in assembler.

    If code is slower, we need more horsepower, and more optimized tools that can make safe, secure applications run well.

  • Hi Steve

    I agree that in a 'proper' application, you'd need various checks and error handling. And when the developer, after using all the optimizing tools, really can't get any more speed or reduce the size or whatever, then the only option is to upgrade the hardware.

    However, optimizing tools are just pieces of software, so they also suffer from the 'it works' syndrome, and hence may be programmed to always include the checks because it's too hard to do further analysis and see whether they are actually needed. In any event, the tool often can't do a proper cost-benefit analysis to determine how much resource to spend optimizing, because it won't have a usage profile.

    The bottom line is that the developer needs to have an idea how what they are writing will be used and should then focus on those areas where an improvement will pay off and not just assume that the tool will always produce the best code.

    Derek

  • Steve Jones - Editor (12/16/2008)


    Derek,

    Interesting and a nice argument.

    However I'd argue that we need lots of checks in there. It can be the tools adding them, but we are more interconnected, there are more people looking for issues and vulnerabilities in applications in ways that didn't occur when we wrote in assembler.

    If code is slower, we need more horsepower, and more optimized tools that can make safe, secure applications run well.

    Here's an interesting paper out of MS Research that puts the last 30 years of software development into perspective:

    Spending Moore’s Dividend

    by James Larus

    5/2/2008

    ftp://ftp.research.microsoft.com/pub/tr/TR-2008-69.pdf

    The opening quotation says it all: "Grove giveth and Gates taketh away." The great silicon dividend has been wasted. How can anyone gloat over all the bloat? :) As for SQL, does anyone know where the inside of SQL Server ends and the outside begins? But the author contends we have a second chance to do it right. :) In any event, James didn't make many friends at MS with this paper. :)

    www.beyondsql.blogspot.com

  • rog pike (12/17/2008)


    The great silicon dividend has been wasted. How can anyone gloat over all the bloat

    Heh... man, do I agree with that! How many times have we heard "upgrade the hardware" as the fix for slow code? The only good part about the bloat is that incredibly large and fast hard disks and gobs of memory have become relatively inexpensive.

    --Jeff Moden



  • A co-worker at my previous employer really liked Power Basic, which creates extremely fast and tight code. They recently added a /bloat switch that makes the executable bigger, but it still runs just as fast.
