Looking for a very heavy query against the AdventureWorks DW database for a DEMO

  • I am doing a POC on something performance-analysis related, and I need to demo a very heavy, resource-consuming query against any table(s) in a demo environment (AdventureWorks, the AdventureWorks DW database, or any other sample DB).

    What is the heaviest query you have ever seen or can think of to run for such purposes? Querying one or many tables, it does not matter. It just needs to be heavy enough to run for maybe a minute or more.

    Likes to play Chess

  • You could row-multiply and summarize across large sequences using a tally function, and it wouldn't matter which DB instance. These peg the CPU at 100%. If you'd like it to run longer, add some zeros.

    /* 4 seconds -- Standard S2 @50 DTU */
    WITH big_cte AS
    (
        SELECT n1.n,
               COUNT(*) AS n2_count
        FROM dbo.fnTally(1, 10000000) n1
        CROSS JOIN dbo.fnTally(1, 10000000) n2
        GROUP BY n1.n
    )
    SELECT COUNT(*) FROM big_cte;

    /* 47 seconds -- Standard S2 @50 DTU */
    WITH big_cte AS
    (
        SELECT n1.n,
               COUNT(*) AS n2_count
        FROM dbo.fnTally(1, 100000000) n1
        CROSS JOIN dbo.fnTally(1, 100000000) n2
        GROUP BY n1.n
    )
    SELECT COUNT(*) FROM big_cte;

    No one shall be able to expel us from the paradise that Cantor has created for us.

  • It's nice, but I need to use real tables.

    Like the ATTACHED SQL, blowing up the tables. And here is the plan below of what I'm trying to do. This is a POC demo for a group of DBAs and product owners...

    POC PLAN FOR CCI AND CCI ON IN-MEMORY TABLES

    For live demo

    DEMO 1: REGULAR TABLE (NON-MEMORY-OPTIMIZED)

    * Creating a test table that conforms to the table-design rules that qualify it for a memory-optimized table and a CCI (a rough DDL/population sketch follows this plan).

    * Populating it with 10+ million rows of random values (I'll create a script to automate that test data load).

    * Running a HEAVY query against the table (+ noting/saving/documenting the (bad) stats and query plan).

    * Creating a couple of regular clustered and non-clustered indexes.

    * Running the same HEAVY query against the table (+ noting/documenting the (improved) run stats and query plan).

    * Dropping the regular clustered and non-clustered indexes and creating vertical indexes (CCI and CI).

    * Running the same HEAVY query against the table (+ noting/documenting the (DRASTICALLY improved) stats, speed, and query plan).

    DEMO 2: MEMORY-OPTIMIZED TABLE: same steps as in DEMO 1, using:

    * The same table recreated as in-memory and re-populated with test data before running steps 2, 3, and 4 above.

    * Running the same HEAVY query against the table (+ noting/documenting the (REMARKABLY improved) stats, speed, and query plan).
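    To make the plan concrete, here is a minimal sketch of what the DEMO 1 and DEMO 2 objects and data load might look like. All object and column names (dbo.SalesDemo, dbo.SalesDemo_InMem, and so on) are placeholders invented for illustration, the load uses the dbo.fnTally function mentioned earlier in the thread, and the in-memory version assumes the database already has a MEMORY_OPTIMIZED_DATA filegroup.

    /* DEMO 1: disk-based test table; PK is nonclustered so a CCI can be added later */
    CREATE TABLE dbo.SalesDemo
    (
        SalesId    INT IDENTITY(1, 1) NOT NULL
                   CONSTRAINT PK_SalesDemo PRIMARY KEY NONCLUSTERED,
        CustomerId INT            NOT NULL,
        ProductId  INT            NOT NULL,
        OrderDate  DATE           NOT NULL,
        Quantity   INT            NOT NULL,
        UnitPrice  DECIMAL(10, 2) NOT NULL
    );

    /* Populate with 10 million rows of pseudo-random values */
    INSERT INTO dbo.SalesDemo (CustomerId, ProductId, OrderDate, Quantity, UnitPrice)
    SELECT ABS(CHECKSUM(NEWID()) % 100000) + 1,
           ABS(CHECKSUM(NEWID()) % 5000) + 1,
           DATEADD(DAY, ABS(CHECKSUM(NEWID()) % 3650), '20150101'),
           ABS(CHECKSUM(NEWID()) % 100) + 1,
           CAST(ABS(CHECKSUM(NEWID()) % 10000) AS DECIMAL(10, 2)) / 100
    FROM dbo.fnTally(1, 10000000) t;

    /* DEMO 1 "vertical index" step: add the clustered columnstore index */
    CREATE CLUSTERED COLUMNSTORE INDEX CCI_SalesDemo ON dbo.SalesDemo;

    /* DEMO 2: same table recreated as memory-optimized with an inline CCI
       (SQL Server 2016+; requires a MEMORY_OPTIMIZED_DATA filegroup) */
    CREATE TABLE dbo.SalesDemo_InMem
    (
        SalesId    INT IDENTITY(1, 1) NOT NULL
                   CONSTRAINT PK_SalesDemo_InMem PRIMARY KEY NONCLUSTERED,
        CustomerId INT            NOT NULL,
        ProductId  INT            NOT NULL,
        OrderDate  DATE           NOT NULL,
        Quantity   INT            NOT NULL,
        UnitPrice  DECIMAL(10, 2) NOT NULL,
        INDEX CCI_SalesDemo_InMem CLUSTERED COLUMNSTORE
    )
    WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);

    /* Re-populate the in-memory copy from the disk-based table */
    INSERT INTO dbo.SalesDemo_InMem (CustomerId, ProductId, OrderDate, Quantity, UnitPrice)
    SELECT CustomerId, ProductId, OrderDate, Quantity, UnitPrice
    FROM dbo.SalesDemo;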

     

     

     

    Likes to play Chess

  • The attached script nicely blows up the 2 tables to 5 million rows and to an 8 GB database size.

    I need to devise several examples of Bad Queries (possibly purposefully written in the worst possible way, using BAD PRACTICES) in order to show a spectacular and convincing improvement in even very bad query performance AFTER creating a Clustered Columnstore index and maybe a nonclustered CI too.

    So this post is not about the need to use good practices or best practices. This post is about purposefully using bad practices to write a bad-performing and cumbersome T-SQL SELECT query. I am kinda looking for a "bad boys, bad boys" or "bad girls" kind of idea. It is like writing a creative joke in T-SQL, a sort of SQL stand-up comedy. That is the core of my idea: to show all the decision-making parties involved that a Columnstore index, and especially a Columnstore index on a Memory Optimized Table, can bring about a significant and noticeable performance improvement EVEN on badly designed queries, without necessarily having to rewrite them. It's a pure POC of vertical indexes.
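    For instance, a deliberately bad HEAVY query against a table shaped like the ones in the attached script could pile up several classic anti-patterns at once: functions wrapped around columns so the predicates are non-SARGable, a pointless conversion with a leading-wildcard LIKE, a needless DISTINCT, and a correlated scalar subquery evaluated per row. The dbo.SalesDemo names below are placeholders, not the actual attached tables.

    /* Deliberately bad-practice query (placeholder table and column names) */
    SELECT DISTINCT
           UPPER(CONVERT(VARCHAR(30), s.OrderDate, 112)) AS OrderDateKey,
           s.CustomerId,
           (SELECT AVG(s2.Quantity * s2.UnitPrice)       -- correlated scalar subquery, evaluated per row
              FROM dbo.SalesDemo s2
             WHERE s2.CustomerId = s.CustomerId) AS AvgCustomerSale
    FROM dbo.SalesDemo s
    WHERE YEAR(s.OrderDate) = 2020                                   -- function on a column defeats index seeks
      AND LTRIM(RTRIM(CAST(s.ProductId AS VARCHAR(20)))) LIKE '%7%'  -- pointless conversion + leading wildcard
    ORDER BY OrderDateKey DESC;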

     

     

    Likes to play Chess

  • Attaching again as TXT.


    Likes to play Chess

  • Have you considered using a tool to simulate requests? Some DBAs are cynical about performance claims; sometimes vendor examples are designed to support the solution being offered by salesmen. I've been meaning to look into this tool for a long time but never got around to it:

    https://www.hammerdb.com/

     

    No one shall be able to expel us from the paradise that Cantor has created for us.

  • How about some nice triangular joins done inside a cursor? Jeff Moden: Look Away.


    "If I had been drinking out of that toilet, I might have been killed." -Ace Ventura

  • autoexcrement wrote:

    How about some nice triangular joins done inside a cursor? Jeff Moden: Look Away.

    Oddly enough, I was going to suggest Triangular Joins. I decided not to because I don't believe they fit the correct use case for columnstore indexes, because the data is too random. Maybe I could warp it to fit, because it sure does fit the bill for nasty "runs forever" queries.
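    In its simplest form, a triangular join is just a self-join on an inequality, so each row is matched against every row at or before it and the intermediate row count grows at roughly N*(N+1)/2 for N rows. A minimal sketch, using a hypothetical dbo.SalesDemo table purely for illustration:

    /* Triangular join: running totals via an inequality self-join.
       For N rows this touches roughly N*(N+1)/2 intermediate rows. */
    SELECT s1.SalesId,
           COUNT(*)                        AS RowsAtOrBefore,
           SUM(s2.Quantity * s2.UnitPrice) AS RunningTotal
    FROM dbo.SalesDemo s1
    JOIN dbo.SalesDemo s2
      ON s2.SalesId <= s1.SalesId
    GROUP BY s1.SalesId
    ORDER BY s1.SalesId;

    On a multi-million-row table that is exactly the nasty "runs forever" shape described above.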

     

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Do you know where I can see an example of such a triangular join?

     

    thanks.

    Likes to play Chess

  • https://www.sqlservercentral.com/articles/hidden-rbar-triangular-joins


    "If I had been drinking out of that toilet, I might have been killed." -Ace Ventura
