Ideas for setting up Azure SQL managed instance for transaction and reporting

  • Grant Fritchey - Monday, September 10, 2018 7:55 AM

    Strictly speaking, no. Again, we pay for a limit. What we do within that limit doesn't affect the limit.

    Don't get me wrong. I'm not advocating for bad code. I'm not saying we shouldn't tune or that tuning is not a net positive. I'm being very specific. The OP said "you have to pay for each table scan" and that single statement is untrue.

    Let's take an example. I have a database on my portal right now. It's Standard, S0, $15/month with a 10 DTU limit. If today I run a query that does a seek and it takes up, let's say, 1 DTU, I pay $15 for the month. If tomorrow I run a query that does a scan and it takes up 10 DTU, I still pay $15 for the month. Scans or seeks, nothing changed. If I don't use any DTU at all, the cost is still $15/month. And remember, the DTU is a throughput limit, like the size or number of cores (vCore is more precise in this regard). So, even if one query uses all 10 DTU, that doesn't mean no more queries can run. It's a DTU limit, not a bucket. I can use no more than 10 DTU at a time, but I can use 10 DTU, all day long, all month long, scans, seeks, whatever, and my cost will be $15 for that month. Not counting, of course, storage.
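    To make the flat-rate point concrete, here's a minimal sketch of the billing model described above, using the hypothetical numbers from the post (Standard S0, $15/month, 10 DTU cap). The function and its parameters are illustrative, not any real Azure API:

```python
def monthly_cost(dtu_used_per_query, rate_per_month=15.0, dtu_cap=10):
    # Flat-rate billing: the bill is the tier price, regardless of how
    # much of the DTU allowance the queries actually consume.
    # The 10 DTU cap only throttles concurrent throughput -- it is a
    # limit, not a bucket that drains.
    return rate_per_month

seek_month = monthly_cost([1] * 1000)    # light seeks all month
scan_month = monthly_cost([10] * 1000)   # heavy scans all month
# Both months bill the same $15 -- scans don't cost more,
# they just leave less headroom under the cap.
```

The usage argument is deliberately ignored: that is the whole point of the tier model, as opposed to the consumption-based products discussed further down.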

    Yep, and this is the nice thing about it. You get all of these statistics built in to understand what's really going on with those queries, to help you determine if you really need to bump the DTU up, bump it down, or keep it the same. The same applies to long-running queries: you can identify them and make optimization changes, or, like I said previously, have it make recommendations and even deploy them automatically to help you save time.

    This is also the same for Azure Data Warehouse. You're paying for DWUs per hour. The lowest is around 100 DWU for $1.50 an hour. You can query it all you like for that entire hour, and it will only cost you that $1.50. Now, your query may run dirt slow because there is not enough juice. In that case, which is more to your point, it will take longer to run and thus cost you more money, even if it's optimized completely. But if it runs within an hour, run as many queries as you like, it's still $1.50 per hour.
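    The per-hour model above can be sketched the same way. The rounding-up-to-whole-hours behavior is an assumption for illustration, and the $1.50/100 DWU figure comes from the post, not a current price sheet:

```python
import math

def dwu_query_cost(runtime_hours, rate_per_hour=1.50):
    # Billed for the hours the warehouse is provisioned, not per query:
    # everything that finishes inside one hour still costs one hour's rate.
    # An under-provisioned (slow) query spills into more hours and so
    # costs more, even if the query itself is perfectly tuned.
    return math.ceil(runtime_hours) * rate_per_hour

fast_query = dwu_query_cost(0.25)  # finishes well inside the hour
slow_query = dwu_query_cost(3)     # not enough DWU juice, 3 hours
```

Here the runtime, not the number of scans or seeks, is what drives the bill.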

    On the other hand, you have products like Azure Data Lake Store, which is billed based on storage, reads, and writes of the data. Then you have a paired product like Azure Data Lake Analytics that sits on top of the store, which charges you per minute to run the query. It also has its own compute/parallelism metric like DTU/DWU, but it's called AU. So if you have it set to an AU level that costs $1.00 per minute, you pay for however many minutes it takes to run the query from the Azure Data Lake Analytics product, plus however much reading and writing you did from the Azure Data Lake Store. Gets pretty crazy to calculate the costs there for a single BAD query. But in this case, you do want to reduce what you read and write to save costs, and the only way to do that is to ensure you build your data smartly on the data lake store to begin with.
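    This two-part charge (compute minutes plus storage I/O) can be sketched as below. Only the $1.00/minute AU figure comes from the post; the per-GB read/write rates are made-up placeholders, not real Azure prices:

```python
def adla_job_cost(runtime_minutes, au_rate_per_min=1.00,
                  gb_read=0.0, gb_written=0.0,
                  read_rate_per_gb=0.004, write_rate_per_gb=0.05):
    # Compute side: billed per minute the job runs at the chosen AU level.
    compute = runtime_minutes * au_rate_per_min
    # Storage side: the paired Data Lake Store bills reads and writes
    # separately, so a bad query that rescans the lake pays twice --
    # once in minutes, once in I/O.
    io = gb_read * read_rate_per_gb + gb_written * write_rate_per_gb
    return compute + io

# A 10-minute job that reads 100 GB and writes 10 GB:
job = adla_job_cost(10, gb_read=100.0, gb_written=10.0)
```

Unlike the DTU/DWU models, here trimming what you read and write directly lowers the bill, which is why laying the data out smartly in the store matters.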

  • The only way poor code in Azure makes you pay more is if you need to raise your DTU level to account for more workload. As Grant and xsevensinzx mentioned, this is a budget. If queries in your workload can't run because of your DTU cap, then yes, poor code might be more expensive, because you'd decide to increase the DTU limit. If you don't have other items that need to run, or you accept slower performance, there's no change in cost from poor code.

    In the real world, we might need to increase the DTUs because clients and management want responsiveness. In that case, we'll likely increase the tier to handle the capacity of the bad code and pay more. Where we might look good is if we then write better code and can scale the DTU limit back down.
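    The tier-bump effect described above is easy to sketch. Only S0's $15/10 DTU pairing comes from the thread; the other tiers are placeholder numbers for illustration:

```python
# Hypothetical tier table: (DTU cap, monthly price). Only the first
# entry reflects a figure from the thread; the rest are invented.
TIERS = [(10, 15.0), (20, 30.0), (50, 75.0)]

def cheapest_tier(peak_dtu_needed):
    # Pick the lowest-priced tier whose DTU cap covers the workload's peak.
    for cap, price in TIERS:
        if peak_dtu_needed <= cap:
            return cap, price
    raise ValueError("workload exceeds the largest tier")

# Bad code that pushes peak demand from 8 DTU to 18 DTU forces a tier
# bump -- that decision, not the scans themselves, is what raises the bill.
tuned = cheapest_tier(8)     # fits in S0 at $15/month
untuned = cheapest_tier(18)  # needs the next tier up
```

Tuning the code back under the smaller cap is what lets you scale the limit, and the bill, back down.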

  • Thanks a lot to everyone for the answers and clarification regarding the pricing model, and for the ideas to move the reporting part to Azure Data Warehouse.
