External Article

Making your SQL Server database changes backward compatible: Adding a new column

As multi-tier architectures grow and change over time, it is often challenging to coordinate those changes across the data, logic, and presentation tiers. Unless planned and implemented carefully, an act as simple as adding a column to a table can grind every component of your application to a halt. While some of us have comfortable 12-hour maintenance windows every weekend, many of us are bound by much stricter service level agreements. So we must find ways to introduce fixes and new features with zero downtime, and without requiring every component to be refactored at the same time.
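
To make the idea concrete, here is a minimal sketch of one common way to add a column without breaking existing callers; this is not necessarily the approach the article takes, and the table and column names (dbo.Customers, PreferredName) are hypothetical:

-- Step 1: add the new column as NULLable, so existing INSERT
-- statements that do not name it continue to succeed.
ALTER TABLE dbo.Customers
    ADD PreferredName nvarchar(100) NULL;
GO

-- Step 2 (a later deployment, once every caller supplies a value):
-- backfill the old rows, then tighten the constraint.
UPDATE dbo.Customers
SET    PreferredName = N''
WHERE  PreferredName IS NULL;

ALTER TABLE dbo.Customers
    ALTER COLUMN PreferredName nvarchar(100) NOT NULL;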

Blogs

From Data Custodian to Innovation Catalyst: The Evolving Role of the CDO

There was a time when the Chief Data Officer lived in the shadows of...

Down the Rabbit Hole: Dealing with Ad-Hoc Data Requests

"But I don’t want to go among mad people," Alice remarked."Oh, you can’t help...

Adding a Local Model to Ollama through the GUI

I saw some good reviews of the small gemma3 model in a few places...

Read the latest Blogs

Forums

Create an HTML Report on the Status of SQL Server Agent Jobs

By Nisarg Upadhyay

Comments posted to this topic are about the item Create an HTML Report on...

We Should Demand Better

By Steve Jones - SSC Editor

Comments posted to this topic are about the item We Should Demand Better

Estimated Rows

By Steve Jones - SSC Editor

Comments posted to this topic are about the item Estimated Rows

Visit the forum

Question of the Day

Estimated Rows

I have two calls to the GENERATE_SERIES TVF in this code:

SELECT   TOP 10 gs.value
FROM     GENERATE_SERIES(1, 10) AS gs
ORDER BY NEWID()
OPTION (RECOMPILE);
GO
DECLARE @a int = 10;
SELECT   TOP (@a) gs.value
FROM     GENERATE_SERIES(1, @a) AS gs
ORDER BY NEWID()
OPTION (RECOMPILE);

In the actual query plans, what is the estimated number of rows for each batch?
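
If you would like to check for yourself before answering, one way to capture the actual plans is SET STATISTICS XML; this sketch assumes SQL Server 2022 or later, since GENERATE_SERIES requires database compatibility level 160:

SET STATISTICS XML ON;
GO
-- Run either batch above here; each result set is followed by the
-- actual showplan XML, where the EstimateRows attribute on the
-- plan operators holds the estimated number of rows.
SET STATISTICS XML OFF;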

See possible answers