Additional Articles


External Article

What are the Key DevOps Performance Metrics You Should Track?

Successful DevOps teams rely on data-driven decision-making to continuously improve software delivery and operational performance. Understanding the right DevOps performance metrics is crucial for identifying bottlenecks, improving efficiency, and maintaining high availability. Metrics provide insight into how well your team deploys software, how quickly issues are resolved, and how stable the production environment remains over time.
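To make this concrete, here is a minimal sketch of how two common DORA-style metrics could be computed in T-SQL; the dbo.Deployments table and its columns are hypothetical, invented purely for illustration:

-- Hypothetical dbo.Deployments table: DeployedAt datetime2, CausedIncident bit
SELECT
    -- Deployment frequency: deployments per week over the window
    COUNT(*) * 1.0
        / NULLIF(DATEDIFF(WEEK, MIN(DeployedAt), MAX(DeployedAt)), 0)
        AS DeploymentsPerWeek,
    -- Change failure rate: share of deployments that caused an incident
    AVG(CAST(CausedIncident AS float)) * 100 AS ChangeFailureRatePercent
FROM dbo.Deployments
WHERE DeployedAt >= DATEADD(MONTH, -3, SYSDATETIME());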

2025-06-16

External Article

Real-time Data Streaming in Snowflake

Real-time data ingestion has become essential for modern analytics and operational intelligence. Organizations across industries need to process data streams from IoT sensors, financial transactions, and application events with minimal latency. Snowflake offers two robust approaches to meet these real-time data needs: Snowpipe for near-real-time file-based streaming and Direct Streaming via Snowpark API for true real-time data integration.
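For a flavor of the file-based approach, a minimal Snowpipe definition looks roughly like this; the database, schema, stage, and table names are hypothetical, and the article may use different options:

-- Sketch: auto-ingest new JSON files from a stage into a target table
CREATE OR REPLACE PIPE my_db.my_schema.events_pipe
  AUTO_INGEST = TRUE   -- load files as soon as they land in the stage
AS
COPY INTO my_db.my_schema.events
FROM @my_db.my_schema.events_stage
FILE_FORMAT = (TYPE = 'JSON');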

2025-06-13

External Article

Index Scans and Table Scans

You can improve performance by throwing more hardware at the problem, but you usually get the most benefit from tuning your queries. One common problem is missing or incorrect indexes, which force SQL Server to process more data than necessary to find the records that meet a query's criteria. In execution plans, these issues show up as index scans and table scans.
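As a quick illustration (with hypothetical table and index names, not taken from the article), a range predicate on an unindexed column typically produces a scan, and a supporting nonclustered index lets the optimizer seek instead:

-- Without an index on OrderDate, this query scans the whole table:
SELECT OrderID, CustomerID
FROM   dbo.Orders
WHERE  OrderDate >= '20250101';

-- A nonclustered index on the predicate column enables an index seek;
-- INCLUDE covers CustomerID so no key lookup is needed (OrderID is assumed
-- to be the clustered key and is carried in the index automatically):
CREATE NONCLUSTERED INDEX IX_Orders_OrderDate
    ON dbo.Orders (OrderDate)
    INCLUDE (CustomerID);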

2025-06-06

External Article

Purging Data from a Large Table in SQL Server

Purging data from a table is a common database maintenance task, done to keep the table from growing too large or to stay in compliance with data retention requirements. With small amounts of data this can be accomplished by a simple delete with no issues; with larger tables, however, the task can be problematic. Deleting records requires locks that can block other processes from writing, or even reading, the data (depending on your isolation level). In this article I will share a technique I have used to purge some very large tables.
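One widely used batching pattern looks like the sketch below (the table, column, and retention window are hypothetical, and the article's exact technique may differ); deleting in small chunks keeps each transaction, and therefore each lock footprint, short:

DECLARE @BatchSize int = 5000;

WHILE 1 = 1
BEGIN
    -- Delete one small batch at a time to keep locks short-lived
    DELETE TOP (@BatchSize)
    FROM dbo.AuditLog
    WHERE CreatedDate < DATEADD(YEAR, -2, GETDATE());

    IF @@ROWCOUNT = 0 BREAK;  -- nothing left to purge

    WAITFOR DELAY '00:00:01'; -- brief pause so other sessions and log
                              -- maintenance can keep up
END;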

2025-06-04

Blogs

From Data Custodian to Innovation Catalyst: The Evolving Role of the CDO

There was a time when the Chief Data Officer lived in the shadows of...

Down the Rabbit Hole: Dealing with Ad-Hoc Data Requests

"But I don’t want to go among mad people," Alice remarked."Oh, you can’t help...

Adding a Local Model to Ollama through the GUI

I saw some good reviews of the small gemma3 model in a few places...

Read the latest Blogs

Forums

Create an HTML Report on the Status of SQL Server Agent Jobs

By Nisarg Upadhyay

Comments posted to this topic are about the item Create an HTML Report on...

We Should Demand Better

By Steve Jones - SSC Editor

Comments posted to this topic are about the item We Should Demand Better

Estimated Rows

By Steve Jones - SSC Editor

Comments posted to this topic are about the item Estimated Rows

Visit the forum

Question of the Day

Estimated Rows

I have two calls to the GENERATE_SERIES TVF in this code:

SELECT   TOP 10 gs.value
FROM     GENERATE_SERIES(1, 10) AS gs
ORDER BY NEWID ()
OPTION (RECOMPILE);
go
DECLARE @a int = 10;
SELECT   TOP (@a) gs.value
FROM     GENERATE_SERIES(1, @a) AS gs
ORDER BY NEWID ()
OPTION (RECOMPILE);

In the actual query plans, what is the estimated number of rows for each batch?

See possible answers