The Case of the Shrinking CFO, err Database
Shrink SQL Server databases quickly and with virtually no contention.
2017-12-25 (first published: 2015-08-17)
4,349 reads
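As a minimal sketch of the low-contention shrink idea (the file name, sizes, and step width below are assumptions for illustration, not values from the article): shrink the data file in many small increments instead of one large operation, so each shrink call holds its locks only briefly.

-- Hedged sketch: incremental shrink of an assumed data file.
-- N'MyDatabase_Data' and the sizes are placeholders, not the article's values.
DECLARE @TargetSizeMB int = 50000;   -- assumed final size
DECLARE @StepMB int = 500;           -- small steps keep each shrink short
DECLARE @CurrentSizeMB int;

SELECT @CurrentSizeMB = size / 128   -- size is stored in 8 KB pages; 128 pages = 1 MB
FROM sys.database_files
WHERE name = N'MyDatabase_Data';

WHILE @CurrentSizeMB > @TargetSizeMB
BEGIN
    SET @CurrentSizeMB = CASE WHEN @CurrentSizeMB - @StepMB < @TargetSizeMB
                              THEN @TargetSizeMB
                              ELSE @CurrentSizeMB - @StepMB END;
    DBCC SHRINKFILE (N'MyDatabase_Data', @CurrentSizeMB);
END;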
How to delete millions of rows with virtually no contention.
2015-09-04 (first published: 2013-03-06)
28,345 reads
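A hedged illustration of the batched-delete approach (the table, column, and retention cutoff are assumed names for the sketch, not taken from the article): delete in small chunks inside a loop so each transaction stays short and blocking is limited to one batch at a time.

-- Hedged sketch of a batched delete; dbo.AuditLog and LogDate are assumed names.
DECLARE @BatchSize int = 5000;
DECLARE @RowsDeleted int = 1;

WHILE @RowsDeleted > 0
BEGIN
    DELETE TOP (@BatchSize)
    FROM dbo.AuditLog
    WHERE LogDate < DATEADD(YEAR, -2, GETDATE());   -- assumed retention cutoff

    SET @RowsDeleted = @@ROWCOUNT;                  -- stop once nothing is left to delete
END;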
A SQL Server migration with minimal business impact while synchronizing schema and data.
2013-09-23
3,373 reads
Your production SQL Server transactional replication just failed and the business impact is critical. How do you get replication restored in minutes?
2013-05-30
8,521 reads
This article will show you one way to quickly restore SQL Server replication with huge tables.
2012-05-18
7,709 reads
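One approach commonly used for the scenario in the two replication articles above (a sketch under assumptions; the server, database, and publication names are placeholders, and the articles may describe a different method): when the subscriber's data is already known to be in sync, re-create the subscription without delivering a snapshot.

-- Hedged sketch: re-add a push subscription without a snapshot, run at the
-- publisher in the publication database. All names below are assumptions.
EXEC sp_addsubscription
    @publication       = N'MyPublication',
    @subscriber        = N'SubscriberServer',
    @destination_db    = N'SubscriberDB',
    @subscription_type = N'Push',
    @sync_type         = N'replication support only';  -- skip snapshot delivery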
How do you delete millions of rows with minimal impact to the business? This article shows one way to remove old data in manageable batches.
2012-03-06
15,478 reads
I have two calls to the GENERATE_SERIES TVF in this code:
SELECT TOP 10 gs.value
FROM GENERATE_SERIES(1, 10) AS gs
ORDER BY NEWID()
OPTION (RECOMPILE);
GO

DECLARE @a int = 10;

SELECT TOP (@a) gs.value
FROM GENERATE_SERIES(1, @a) AS gs
ORDER BY NEWID()
OPTION (RECOMPILE);

In the actual query plans, what is the estimated number of rows for each batch?