Google Scale

  • Comments posted to this topic are about the item Google Scale

  • that should mean some regular effort to improve your skills and become better at your craft

    Absolutely. It irritates me to see people who are complacent with what they know. Yet it is these very people who are, day by day, making their own skills obsolete.

    I strive to learn something new every day, no matter how small.

    Wayne
    Microsoft Certified Master: SQL Server 2008
    Author - SQL Server T-SQL Recipes


    If you can't explain to another person how the code that you're copying from the internet works, then DON'T USE IT on a production system! After all, you will be the one supporting it!
    Links:
    For better assistance in answering your questions
    Performance Problems
    Common date/time routines
    Understanding and Using APPLY Part 1 & Part 2

  • Great article.

    I think the younger the programmer, the less attention to small detail...

  • I liked today's article, I never thought about the implications of Gs before.

    What I noticed in particular was how you felt you track things more closely because you come from a time when resources were more constrained. I was going to jump in and say, "Me too!", because I've been around since the days of 2400 baud modems and 8086 assembly language (so, a little after you), and I feel like I "care" that things are packed into efficient structures and data types.

    But at the same time (note: I'm a hobby or support programmer, not a professional developer), I made a choice a long time ago to "let some things go" in the interest of more readable code, which always felt better to me than squeezing out an optimisation that only its inventor understands at a glance (or sometimes even after much study).

    And now, working in VB.NET as I do (hey, that's what SQL Server Reporting Services allows), I have to take an even more circuitous route than ever before, not that that's a bad thing.

    I've also worked with a lot of old-school programmers from the days of the VAX and such. While some of them truly are experts at squeezing the most out of some code, others have a philosophy more like mine: write the more verbose code and leave the compiler to do most of the optimisation work.

    I also realise that modern (more recently educated) programmers have to learn a lot of tight algorithms and such when earning their degrees, so in many ways their programming is likely far superior to mine. I can't say for sure, so I can't say, "My code is better because I've been around longer," can I?

    And so, all I can say is, "I feel that way too, but I think everyone probably does, and it's all very vague and likely not true, as much as we wish it were" 🙂

  • Efficiency is a good goal and one I strive for as a developer. But I always take time to make sure my code is maintainable so someone besides myself can maintain or reuse it.

  • paul s-306273 (11/16/2010)


    I think the younger the programmer, the less attention to small detail...

    I think I'd have to agree with you and I am a younger programmer.



    The opinions expressed herein are strictly personal and do not necessarily reflect the views or policies of my employer.

  • "Jeff Moden often quotes, it doesn't take any more time to do it right the first time."

    I was struck by this comment in today's editorial. Struck by the simplicity of the comment and just how completely wrong it is.

    If Jeff's comment were correct, then building a custom home would mean buying some wood and nails and just whacking them together. We wouldn't need architects, building codes, inspectors, testing or any such thing. But in fact, that's not how building custom homes works - I know, I've built two over the last forty years. In fact, it's a very long and involved process.

    Likewise, in coding, if we simply presume that writing "good code" takes the same effort as writing "crap code", then we wouldn't need coding guidelines, testing, QA, regression testing, or any 'spit and polish' activities. Of course, the very article this comment appears in describes how Gs goes out of its way to ensure code is "the best it can be", so it seems odd to then include Jeff's comment - the complete opposite of what is being presented in the editorial!

    As well, it might be that in Jeff's world he is always writing clean new code, and I have no doubt he is a SQL wizard. But for most average developers, the best code does not come without some involved, time-consuming extended effort. And writing entire software systems in .NET? Well, I don't know if Jeff undertakes those efforts, but I can assure you NO ONE bangs out "perfect" code on the first shot, let alone in the same time it takes to produce tested and optimal code.

    So in fact, anyone who presumes that "doing it right the first time" takes the same amount of time as actually designing, testing, engineering, testing again, polishing, testing again... and so on... Well, no insult intended to Jeff, but it's just an absurd idea. Optimal code takes much more time than just banging out something that "works".

    After all, if this thesis held any water, NASA would just buy a few garbage cans at their local Ace Hardware, weld them together, fill them with gasoline, duct tape some poor astronaut to the top, and shoot for the moon. That's not how it, or anything involved and complex, is engineered.

    There's no such thing as dumb questions, only poorly thought-out answers...
  • blandry (11/16/2010)


    "Jeff Moden often quotes, it doesn't take any more time to do it right the first time."

    I was struck by this comment in today's editorial. Struck by the simplicity of the comment and just how completely wrong it is.

    ...

    I think what Jeff implies with that statement is that "whipping something together for now" instead of taking the time to engineer a better way doesn't mean you saved time, because you end up having to go back and fix or rewrite it. Better to take the time to engineer a good, well-thought-out system than to whip up a half-baked cocktail, because the time you saved will be used up fixing it down the road. I think his point is that it all balances out in the end.

  • My son recently started working for Google, working on backend systems. Should be an interesting job.

    ...

    -- FORTRAN manual for Xerox Computers --

  • blandry (11/16/2010)


    "Jeff Moden often quotes, it doesn't take any more time to do it right the first time."

    I was struck by this comment in today's editorial. Struck by the simplicity of the comment and just how completely wrong it is.

    I am with you on this - Jeff Moden's comment makes no sense to me either.

    Getting back to Google's path to success, AFAIK they did not write highly scalable code the first time either. They started to concentrate on perfect, highly scalable code when it began to make sense economically - when they had enough scale to justify it. I am not a big expert on Google's history, but what I have read makes sense to me.

    When we prototype, we do not productionize the code yet; those who do most likely lose money and go out of business. There are plenty of real-life examples of that.

    When we develop for 10 users and 100K rows, we make it fast enough for 10 users and 100K rows. Making it fast enough for 2K users and 5TB takes considerably more time. When we have the requirements, time, and money for a system for 10 users and 100K rows, and we try to develop something much bigger instead, we usually fail.

    Perfect code is very rare and very expensive. It is rarely needed.

  • Experienced developers, like Jeff, understand that doing a crap job the first time around and then having to fix it is much more expensive than doing a good job the first time, and I’m sure that is the point of his comment.

    Doing a sloppy job of developing a database model leads to a physical model that does not accurately model the data which leads to ugly, expensive hacks to work around the design errors which leads to hard to debug and optimize code full of errors which leads to bad performance and unhappy users which leads to an expensive rewrite. It’s an old story and anyone who has been in the business for very long has a hundred stories just like it.

    Google probably understands better than most just how expensive screw-ups like that are at their scale, but the cost is there for every business that does a poor job of developing software. Building a bridge that falls down may be a more spectacular failure, but building an outhouse that falls down is still a failure, and both can be avoided.

    Sadly, there are many developers who see doing a poor job the first time as the normal way of doing things.

  • I feel like this is a very different discussion for a DBA than for a programmer, for two reasons.

    First, DBAs can impact performance of a database without ever touching the hardware or a single line of code. We can add indexes, maintain statistics, defragment, partition, etc. Developers don't have these options. The only way to make a program run faster is to either recode it or upgrade the hardware. I'm not saying that DBAs have unlimited performance tuning potential either, but without changing code we have our bag of tools that can be used before taking the "last resort" of buying new hardware ... so much so that even after we've spent days or weeks and exhausted all database tuning options, we still feel like we're "giving up" by going the hardware route.

    The second reason, and this is some conjecture on my part, is that the programming world has far more horsepower to play with than the DBA world. CPU speed and core count on a modern server are getting to the point where it's analogous to me buying a car based on whether it can go 150mph or 200mph. It just doesn't really matter that much. I wish DBAs could live in that world!! Everything would be table scans. But we don't -- IOPS are a far rarer resource than CPU cycles, and we still jump through a lot of hoops to conserve them.

    For Google it probably makes sense to pay people just to optimize their code by even 10%. I suspect that most companies don't run that way; I know mine sure doesn't and so I do what I can to get it right the first time. And, I'll continue to pay attention to the little details hopefully for a long time to come.
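
    To make that "tuning without touching code" point concrete, here is a minimal T-SQL sketch. The table and column names (dbo.Orders, CustomerID, etc.) are invented for illustration; the point is only that none of these steps change a single line of application code:

    ```sql
    -- Suppose the application frequently runs a query like:
    --   SELECT OrderID, OrderDate, Total FROM dbo.Orders WHERE CustomerID = @id;

    -- 1. Add a covering index so the query seeks instead of scanning the table.
    CREATE NONCLUSTERED INDEX IX_Orders_CustomerID
        ON dbo.Orders (CustomerID)
        INCLUDE (OrderDate, Total);

    -- 2. Refresh statistics so the optimizer has current row-distribution data.
    UPDATE STATISTICS dbo.Orders WITH FULLSCAN;

    -- 3. Defragment the index as it drifts (REBUILD if fragmentation is heavy).
    ALTER INDEX IX_Orders_CustomerID ON dbo.Orders REORGANIZE;
    ```

    The query text the application sends never changes; only the physical design underneath it does.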

  • Michael Valentine Jones (11/16/2010)


    Experienced developers, like Jeff, understand that doing a crap job the first time around and then having to fix it is much more expensive than doing a good job the first time, and I’m sure that is the point of his comment.

    ...

    The only way in which I would disagree with this sentiment is that it doesn't go far enough. My experience is that it is usually cheaper to do it right the first time, rather than lashing two cocktail straws together and calling it a framework. A problem left to fester in production for months on end before "blowing up" ends up coming back much harder to fix.

    ----------------------------------------------------------------------------------
    Your lack of planning does not constitute an emergency on my part... unless you're my manager... or a director and above... or a really loud-spoken end-user... All right - what was my emergency again?

    I've only had to seriously work at Google scale once. The company did a tremendous volume of broadcast sampling. I'm being vague because I'm still not sure the NDAs have cleared yet on the proprietary stuff. 500+ computers working simultaneously, each doing sound scrapes, all of them hammering a single DB for LOB data and information access/insert/updates. We were in the range of 30,000 transactions per hour per computer, counting reads.

    That was some intense coding, and I wasn't up to par with how good I needed to be there. We've gone our separate ways as friends and I did clean a lot of things up before I left, but there are times I'd like to go back with what I know now and see what I could do better.

    10% can eventually bring a system to a screaming, crying, broken-legged ski-trip style halt. It could destroy a business. Google fighting for 10% makes a ton of sense.

    Bob's AutoMart, on the other hand, can do triangle joins all day for readability and maintainability. Seriously, who cares? The 10-year-old laptop they run it on could handle 20 more DBs with 5x the current number of users and still never notice.

    It's perspective. Hone your craft, sure, but hone it in the ways you need to. If you're not planning on leaving Bob's AutoMart in the near future, concentrate on what in Denali can make Bob more profitable (and get you a raise). They don't need to care whether ISNULL or COALESCE has a 7% time difference in the millisecond range.


    - Craig Farrell

    Never stop learning, even if it hurts. Ego bruises are practically mandatory as you learn unless you've never risked enough to make a mistake.

    For better assistance in answering your questions | Forum Netiquette
    For index/tuning help, follow these directions. | Tally Tables

    Twitter: @AnyWayDBA

  • There's always a point, beyond which, no amount of optimizing anything has any measurable positive effect. There's also a point where the cost of increasing the optimization exceeds the benefit of it. The trick is identifying that point.

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
    Property of The Thread

    "Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon

