Combining R and SQL Server to diagnose performance bottlenecks

  • Comments posted to this topic are about the item Combining R and SQL Server to diagnose performance bottlenecks

  • Gosh, I wish I understood that.

  • How important is the integration to the process? I'm not likely to see SQL Server 2016 in production for years. I have performance problems today that might benefit from the method.

  • Hi Craig, thank you for your response. I don't think it is really terribly difficult, but I guess the big disappointment is that it isn't a hard and fast answer 🙁 But it is the start of a more targeted investigation - specifically focusing on the worst tables.

    Assuming that there are issues with queries or index design, focus on the points that scatter out from the rest and start to look at the most expensive queries and the indexes on those tables. You will only have to do this a few times to start to see common patterns that you will quickly begin to recognise.
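    For example, a generic starting point for "the most expensive queries" is something along these lines (this is just a standard sys.dm_exec_query_stats query, not something from the article):

    -- Top 10 statements by total logical reads (a rough proxy for "most expensive")
    SELECT TOP (10)
        qs.execution_count,
        qs.total_logical_reads,
        qs.total_worker_time,
        qs.total_elapsed_time,
        st.text AS query_text
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    ORDER BY qs.total_logical_reads DESC;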

    Regards,

    Nick

  • Hi Robert - good news! This analysis does not need SQL Server 2016. Run the DMV query, save the results to a CSV and then use R to read the CSV and analyse it like any other data set.
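    Something like this is all it takes (hotspots_data.csv is just the example file name used in the article):

    # read the exported DMV results like any other data set
    data <- read.csv("hotspots_data.csv")
    tables <- data[, 1]    # first column holds the object names
    data <- data[, -1]     # keep only the numeric counters
    # then carry on with the PCA / k-means steps from the article, e.g.
    pca <- prcomp(data)
    summary(pca)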

    There is one big caveat here: the analysis focuses primarily on the way data is being read from tables, i.e. poor indexing or nasty queries. Common issues like memory, CPU, disk and recent changes in the environment still need to be ruled out separately, since this analysis won't surface them.

    Regards,

    Nick

  • nick.dale.burns (12/7/2015)


    Hi Robert - good news! This analysis does not need SQL Server 2016. Run the DMV query, save the results to a CSV and then use R to read the CSV and analyse it like any other data set.

    There is one big caveat here: the analysis focuses primarily on the way data is being read from tables, i.e. poor indexing or nasty queries. Common issues like memory, CPU, disk and recent changes in the environment still need to be ruled out separately, since this analysis won't surface them.

    Regards,

    Nick

    Great!

    Do you have some traces set up as models for excluding those events? Should I share my attempts here?

  • Thanks for this. Very timely. I'm just barely getting started in R and this was extremely useful.

    A couple of notes, though. The query you start with has an ORDER BY clause on a column that isn't defined. I changed my query to this:

    SELECT OBJECT_NAME(ops.object_id) AS [Object Name],
        SUM(ops.range_scan_count) AS [Range Scans],
        SUM(ops.singleton_lookup_count) AS [Singleton Lookups],
        SUM(ops.row_lock_count) AS [Row Locks],
        SUM(ops.row_lock_wait_in_ms) AS [Row Lock Waits (ms)],
        SUM(ops.page_lock_count) AS [Page Locks],
        SUM(ops.page_lock_wait_in_ms) AS [Page Lock Waits (ms)],
        SUM(ops.page_io_latch_wait_in_ms) AS [Page IO Latch Wait (ms)],
        SUM(ops.row_lock_count) AS [RowCount]
    FROM sys.dm_db_index_operational_stats(NULL, NULL, NULL, NULL) AS ops
    INNER JOIN sys.indexes AS idx
        ON idx.object_id = ops.object_id
        AND idx.index_id = ops.index_id
    INNER JOIN sys.sysindexes AS sysidx
        ON idx.object_id = sysidx.id
    WHERE ops.object_id > 100
    GROUP BY ops.object_id
    ORDER BY [RowCount] DESC;

    Is that what you were originally going for, or should I just ignore the ORDER BY clause?

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

    Looks like this code is missing a closing quote:

    data <- read.csv("hotspots_data.csv)

    Or am I wrong on that? I ran my local copy with a closing quote around the hotspots_data.csv file name and it worked. Or did I break things?
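    For reference, here is the line with the closing quote added (this is what worked for me):

    data <- read.csv("hotspots_data.csv")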

    Just trying to understand here. It's all new to me.

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • I'm getting errors from this bit. It could be that I'm using RevolutionR, or it could be Steve's formatting:

    # load GGPLOT library(ggplot2) plot.data <- data.frame(pca$x[, 1:2]) g <- ggplot(plot.data, aes(x=PC1, y=PC2)) + geom_point(colour=alpha("steelblue", 0.5), size=3) + geom_text(label=1:102, colour="darkgrey", hjust=1.5) + theme_bw() print(g)
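    For what it's worth, split onto separate lines that block would read roughly as follows. Note that alpha() lives in the scales package, so it can come up undefined if scales isn't loaded, and label=1:102 assumes exactly 102 rows in the data:

    # load GGPLOT
    library(ggplot2)
    library(scales)    # provides alpha()

    plot.data <- data.frame(pca$x[, 1:2])
    g <- ggplot(plot.data, aes(x = PC1, y = PC2)) +
        geom_point(colour = alpha("steelblue", 0.5), size = 3) +
        geom_text(label = 1:102, colour = "darkgrey", hjust = 1.5) +
        theme_bw()
    print(g)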

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • Ah, figured out that part of my problem is that I don't have the GGPLOT library. Off to track that down.

    Seriously, thanks again for this. You've started me exploring exactly the kind of thing I was hoping to get out of R in the first place. It's truly appreciated.

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • Guess I'm behind the times. I've never heard of R before this.

  • Hi!

    Can you please provide a complete R script for this for us R newbies? 🙂

    Thanks

  • I'm brand new to R and not really sure what I was doing, but I was able to make it through this with a new install of RevolutionROpen and the following adjustments:

    use the command install.packages("ggplot2") to install the ggplot2 package and library(reshape2) to load the melt function

    break the line starting "# load GGPLOT" into its component commands and remove the alpha() function (it was coming up undefined for me)

    change label=1:102 to label=rownames(plot.data) in two places (wasn't sure what I was doing here, but it worked -- unable to get the table names though)

    changed "behavious" to "b2" in the last step

    Thank you for this enlightening example!

  • Full working code to directly query the server. Just fix the connection string for your environment 🙂

    install.packages("ggplot2")

    library(ggplot2)

    install.packages("reshape2")

    library("reshape2", lib.loc="~/R/win-library/3.2")

    install.packages("RODBC")

    library("RODBC", lib.loc="~/R/win-library/3.2")

    #load data

    query <- "select

    object_name(ops.object_id) as [Object Name]

    , sum(ops.range_scan_count) as [Range Scans]

    , sum(ops.singleton_lookup_count) as [Singleton Lookups]

    , sum(ops.row_lock_count) as [Row Locks]

    , sum(ops.row_lock_wait_in_ms) as [Row Lock Waits (ms)]

    , sum(ops.page_lock_count) as [Page Locks]

    , sum(ops.page_lock_wait_in_ms) as [Page Lock Waits (ms)]

    , sum(ops.page_io_latch_wait_in_ms) as [Page IO Latch Wait (ms)]

    from sys.dm_db_index_operational_stats(null,null,NULL,NULL) as ops

    inner join sys.indexes as idx on idx.object_id = ops.object_id and idx.index_id = ops.index_id

    inner join sys.sysindexes as sysidx on idx.object_id = sysidx.id

    where ops.object_id > 100

    group by ops.object_id"

    dbhandle <-

    odbcDriverConnect(

    'driver={SQL Server};server=server;database=database;trusted_connection=true'

    )

    data <- sqlQuery(dbhandle,query)

    tables <- data[,1]

    data <- data[,-1]

    # pca

    pca <- prcomp(data)

    summary(pca)

    plot.data <- data.frame(pca$x[,1:2])

    g <-

    ggplot(plot.data,aes(x = PC1,y = PC2)) + geom_point() + geom_text(label =

    tables)

    print(g)

    # kmeans

    clusters <- kmeans(data, 6)

    plot.data$clusters <- factor(clusters$cluster)

    g <- ggplot(plot.data, aes(x = PC1, y = PC2, colour = clusters)) +

    geom_point(size = 3) +

    geom_text(label = tables, colour = "darkgrey", hjust = 1.5) +

    theme_bw()

    print(g)

    # cluster centers (i.e. "average behaviour")

    behaviours <- data.frame(clusters$centers)

    behaviours$cluster <- 1:6

    b2 <- melt(behaviours, "cluster")

    g2 <- ggplot(b2, aes(x = variable, y = value)) +

    geom_bar(stat = "identity", fill = "steelblue") +

    facet_wrap(~ cluster) +

    theme_bw() +

    theme(axis.text.x = element_text(angle = 90))

    print(g2)

    odbcClose(dbhandle)

  • kevin 19285 (12/7/2015)


    I'm brand new to R and not really sure what I was doing, but I was able to make it through this with a new install of RevolutionROpen and the following adjustments:

    use the command install.packages("ggplot2") to install the ggplot2 package and library(reshape2) to load the melt function

    break the line starting "# load GGPLOT" into its component commands and remove the alpha() function (it was coming up undefined for me)

    change label=1:102 to label=rownames(plot.data) in two places (wasn't sure what I was doing here, but it worked -- unable to get the table names though)

    changed "behavious" to "b2" in the last step

    Thank you for this enlightening example!

    use label=tables 😉

