Machine needs for developing and running SQL

  • Hi

    I'm not sure this is the best forum, but I'm hoping it's the best fit.

    I am a developer working mostly on .NET applications with a SQL Server back end, mostly stored procedures but moving into some Entity Framework. I'm wondering what I should be looking for in terms of a machine, as my current work-provided laptop seems to struggle at times. For example, this afternoon a fairly complex query against a biggish (100 GB) database is pushing my machine to 100% IO, with SQL Server doing 15 MB/s of IO, roughly equally split between reads and writes.

    I have a relatively new 1 TB Seagate hybrid drive in the machine and 8 GB of RAM, running in this case SQL Server 2008 R2, but I need SQL Server 2012 as well.

    Any and all suggestions and pointers gratefully received.

    Unfortunately, refactoring the query isn't possible at the moment, and it's not just this one query that tends to load the machine up.
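
    For what it's worth, a rough sketch of the sort of check that shows where that IO is going from inside SQL Server (sys.dm_io_virtual_file_stats reports cumulative per-file read and write volumes):

    -- Cumulative since instance start; snapshot before and after the query
    -- to see how much of the IO belongs to it.
    SELECT DB_NAME(vfs.database_id)            AS database_name,
           mf.physical_name,
           vfs.num_of_bytes_read    / 1048576  AS mb_read,
           vfs.num_of_bytes_written / 1048576  AS mb_written,
           vfs.io_stall_read_ms,
           vfs.io_stall_write_ms
    FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
    JOIN sys.master_files AS mf
      ON mf.database_id = vfs.database_id
     AND mf.file_id     = vfs.file_id
    ORDER BY vfs.io_stall DESC;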

    Ben

  • Ben Whyall (8/20/2015)



    To be honest, it's probably not the machine you're having problems with. It's most likely the way the query is written, and it would likely present a problem even on a server-quality machine.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Why would you have a 100 GB database on a developer workstation?

    I agree that improving the query would help. Since you say you can't do that at the moment, you should start writing better queries now to prevent more issues in the near future.

    Luis C.
    General Disclaimer:
    Are you seriously taking the advice and code from someone from the internet without testing it? Do you at least understand it? Or can it easily kill your server?

    How to post data/code on a forum to get the best help: Option 1 / Option 2
  • Hmm... depending on the query, 8 GB of memory for a 100 GB DB could be an issue. Improving the query might help some, but if possible, can you just work with a reduced data set?
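
    As a rough sketch (assuming the question is how much of that 8 GB each database is actually getting), sys.dm_os_buffer_descriptors shows how the buffer pool is currently divided up:

    -- One row per 8 KB page currently in the buffer pool, so the count per
    -- database approximates how much cache each database is holding.
    SELECT CASE WHEN database_id = 32767 THEN 'ResourceDb'
                ELSE DB_NAME(database_id) END AS database_name,
           COUNT(*) * 8 / 1024                AS buffer_pool_mb
    FROM sys.dm_os_buffer_descriptors
    GROUP BY database_id
    ORDER BY buffer_pool_mb DESC;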

  • To optimize the performance of queries against large tables, SQL Server uses a data page cache (the buffer pool) and read-ahead buffering to reduce physical reads (disk) and maximize logical reads (cache). The proportion of reads served from the cache is tracked as the "buffer cache hit ratio".

    http://www.johnsansom.com/the-sql-server-buffer-pool-and-buffer-cache-hit-ratio/
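
    As a rough illustration, the ratio can be read from the performance counter DMV; a sketch along these lines:

    -- Buffer cache hit ratio as a percentage: how often requested pages are
    -- found in memory rather than read from disk.
    SELECT CAST(a.cntr_value AS DECIMAL(18, 2)) / b.cntr_value * 100 AS buffer_cache_hit_ratio_pct
    FROM sys.dm_os_performance_counters AS a
    JOIN sys.dm_os_performance_counters AS b
      ON a.object_name = b.object_name
    WHERE a.counter_name = 'Buffer cache hit ratio'
      AND b.counter_name = 'Buffer cache hit ratio base'
      AND a.object_name LIKE '%Buffer Manager%';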

    100 GB is a large dataset to be using for development, and 8 GB of RAM isn't much to work with, especially when it's shared with the OS. You probably have SSMS and Visual Studio sitting there in the background holding 2-4 GB of RAM too. I don't think a laptop motherboard or your wallet could accommodate enough RAM to sufficiently cover ad hoc queries against a 100 GB database.

    Try getting this database down to more like 10 GB at most. Also, if you used the Import Data Wizard in SSMS to copy tables from production to your development laptop, then you need to follow that up by scripting and applying the indexes, because the wizard doesn't do that for you.
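
    As a starting point, a rough sketch like this (run against the restored copy) shows which tables are taking the space, so you know where to trim or which tables to leave behind:

    -- Reserved space and row counts per table; index_id 0/1 = heap or
    -- clustered index, i.e. the base table rows.
    SELECT t.name                                                            AS table_name,
           SUM(ps.reserved_page_count) * 8 / 1024                            AS reserved_mb,
           SUM(CASE WHEN ps.index_id IN (0, 1) THEN ps.row_count ELSE 0 END) AS row_count
    FROM sys.dm_db_partition_stats AS ps
    JOIN sys.tables AS t
      ON t.object_id = ps.object_id
    GROUP BY t.name
    ORDER BY reserved_mb DESC;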

    "Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho

  • Thanks for all the help so far. This is a production backup that I have had to bring down onto my machine to look at this specific problem. Part of this is to try and gather some third-party views on the fact that 8 GB of RAM isn't going to cut it with the amounts of data we're often processing; a laptop generally is going to struggle.

    I will have to look at reducing the dataset. The problem with that is that the data is relatively complex and interrelated, and deletes and truncates of tables can take hours to run in some cases.

    Is there a more efficient way to trim the data down?

    We have a major issue with data as it is: because the system is secure and contains substantial amounts of personal information, we need to scramble the data before it can be brought out of production, let alone used for any sort of development.

  • Ben Whyall (8/20/2015)



    Use the SSMS Import Data Wizard to copy only the tables you need, perhaps using a SELECT statement with a WHERE clause to limit the rows. Once done, you'll need to script out the indexes for those tables from production and apply them to development (a rough sketch follows below).

    Also, if possible, exclude columns containing sensitive data.
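
    For example (the table, column, and index names below are made up, purely to illustrate): the first statement is the kind of query you could paste into the wizard's "Write a query to specify the data to transfer" option to limit the rows, and the second is the sort of index you'd script from production and re-create on the copy afterwards, because the wizard only moves the data.

    -- Hypothetical example: copy only recent rows of one table.
    SELECT *
    FROM dbo.Orders
    WHERE OrderDate >= '20150101';

    -- Hypothetical example: index scripted from production, re-applied on the
    -- development copy after the data load.
    CREATE NONCLUSTERED INDEX IX_Orders_CustomerId
        ON dbo.Orders (CustomerId)
        INCLUDE (OrderDate);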

    "Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho
