Problems displaying this newsletter? View online.
SQL Server Central
Featured Contents
Question of the Day
Featured Script
Redgate Database DevOps
The Voice of the DBA
 

Expect the Unexpected with DiRT

Disaster recovery is one of the core tasks that many DBAs think about on a regular basis. Ensuring that we can get our data back, available, accessible, and intact, is important. More than a few DBAs who haven't been able to recover systems have found themselves seeking new employment.

That's not to say that DBAs always perform perfectly under pressure. Plenty make mistakes, and there may be times when they can't recover all the data. There does seem to be a correlation between how often DBAs practice recovery skills and how well they perform in an actual emergency. I know that at a few companies, we scheduled regular disaster tests, though often only a simulated recovery to systems that weren't expected to actually take over a workload. Arguably not a good test, but better than nothing.

Google takes things a step further. They have annual, company-wide, multi-day DiRT (Disaster Recovery Testing) events. These span many departments and can cause substantial disruption to Google's infrastructure. They are a way for the various individuals responsible for parts of that infrastructure to actually evaluate whether they are prepared for potential issues.

If you read the article, you'll find that Google started small with these events and progressed to larger, more inclusive tests, like taking down a data center. They also whitelist some servers, knowing those cannot pass a test, so there is no reason to actually take them down. After all, the business still needs to work.

It's good to have tests and walk through actual events, like call lists and bridges, to be sure that communication and documentation work. This is especially important because teams often assume that all their written procedures will be available. I went through an audit with one company where we failed immediately because all our DR plans were on a network share. In the simulation, we had experienced a network failure and servers had crashed. We were supposed to bring up the systems on spare hardware, but some critical documentation wasn't available without the network. We started printing things out right away so that we could continue with the simulation (as well as have a copy in a binder in our office).

Not everyone can schedule large-scale tests, and certainly many managers don't see the point. They'll often want to gamble that staff will "figure things out" if there is an incident. That doesn't mean that DBAs and sysadmins can afford to wait for a disaster to practice these skills. Be sure that everyone on your team can recover databases, that they know where backups are (or how to determine this; a query like the one below is a start), and that multiple people have access to resources. The last thing you want is for a disaster to occur during your vacation and have managers calling to cut your holiday short because you're the only one who knows where something is or has the authority to access a resource.
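
As one hedged illustration (mine, not part of the editorial), the standard msdb history tables can tell you where recent backups were written; the seven-day window is just an example.

-- Illustrative sketch: list where recent backups were written, using the msdb history tables.
SELECT bs.database_name,
       bs.backup_finish_date,
       bs.type,                        -- D = full, I = differential, L = log
       bmf.physical_device_name
FROM msdb.dbo.backupset AS bs
JOIN msdb.dbo.backupmediafamily AS bmf
    ON bmf.media_set_id = bs.media_set_id
WHERE bs.backup_finish_date >= DATEADD(DAY, -7, GETDATE())
ORDER BY bs.database_name, bs.backup_finish_date DESC;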

Steve Jones - SSC Editor

Join the debate, and respond to today's editorial on the forums

Redgate SQL Monitor
 
 Featured Contents

Migrating Column Level Encryption from 2008R2 to Azure SQL Database

gareth.barnes from SQLServerCentral

A migration to Azure is stuck because of an encryption change in SQL Server. Learn how to get around this by using a certificate for encryption.
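
For a rough idea of what certificate-based column encryption looks like in T-SQL, here is a hypothetical sketch; the object names are made up, certificate encryption only suits small values, and the article may well take a different route.

-- Hypothetical sketch of certificate-based column encryption; names are illustrative only.
-- A database master key must already exist to protect the certificate's private key.
CREATE CERTIFICATE DemoCert WITH SUBJECT = 'Column encryption demo';
GO
-- Encrypt a value on the way in...
UPDATE dbo.Customers
SET CardNumberEnc = ENCRYPTBYCERT(CERT_ID('DemoCert'), CardNumber);
-- ...and decrypt it on the way out.
SELECT CONVERT(varchar(30), DECRYPTBYCERT(CERT_ID('DemoCert'), CardNumberEnc)) AS CardNumber
FROM dbo.Customers;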

UNION vs. UNION ALL in SQL Server

Additional Articles from MSSQLTips.com

Sometimes there is a need to combine data from multiple tables or views into one comprehensive dataset. This may be for like tables within the same database, or there may be a need to combine like data across databases or even across servers. I have read about the UNION and UNION ALL commands, but how do they work and how do they differ?
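
In short, UNION removes duplicate rows from the combined result, while UNION ALL keeps every row. A tiny illustration (the values here are made up, not from the article):

-- UNION removes duplicates across the combined result
SELECT Colour FROM (VALUES ('Red'), ('Blue')) AS A(Colour)
UNION
SELECT Colour FROM (VALUES ('Blue'), ('Green')) AS B(Colour);
-- returns 3 rows: Red, Blue, Green (in no guaranteed order)

-- UNION ALL keeps every row, and usually performs better because no de-duplication is needed
SELECT Colour FROM (VALUES ('Red'), ('Blue')) AS A(Colour)
UNION ALL
SELECT Colour FROM (VALUES ('Blue'), ('Green')) AS B(Colour);
-- returns 4 rows: Red, Blue, Blue, Green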

Take the 2020 State of Database DevOps Survey – win a new iPad

Additional Articles from Redgate

Redgate’s 2020 State of Database DevOps Survey is open now. Share your views on Database DevOps today for a chance to win a new 64GB iPad Air. Plus, for every entry, Redgate will donate $1 to UNICEF. Take the survey now.

From the SQL Server Central Blogs - Containers: Upgrading SQL Server from 2017 to 2019 RTM

Grant Fritchey from The Scary DBA

Throughout the pre-release of SQL Server 2019, I was demoing an effectively instant, and magical, container upgrade from 2017 to 2019. However, when I finally downloaded the release bits...

From the SQL Server Central Blogs - Collecting Diagnostic data from multiple SQL Server instances with dbatools

spaghettidba from SpaghettiDBA

Keeping their SQL Server instances under control is a crucial part of the job of a DBA. SQL Server offers a wide variety of DMVs to query in order...
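
As a small, hedged example of the kind of diagnostic DMV query involved (not taken from the post), this lists what is running on an instance right now:

-- Illustrative diagnostic query: current requests and the statements they are running.
SELECT r.session_id,
       r.status,
       r.wait_type,
       r.cpu_time,
       t.text AS current_statement
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.session_id <> @@SPID;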

 

 Question of the Day

Today's question (by sergey.gigoyan):

 

Database Backup Types

Which statements are correct about the SQL Server database backup types?
  1. A Transaction Log backup contains all changes made after the previous transaction log backup.
  2. Each differential backup contains only the changes made after the previous differential backup.
  3. Each differential backup contains all changes made after the previous full backup.
  4. A copy-only full backup can be a base for differential backups.
  5. Each full backup contains only the changes made after the previous full backup.

Think you know the answer? Click here, and find out if you are right.
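
For reference while you think about it, these are the backup types the statements describe, in illustrative T-SQL (syntax only; the database name and paths are made up, and this is not the answer):

-- Illustrative syntax only; database name and paths are hypothetical.
BACKUP DATABASE SalesDb TO DISK = N'D:\Backups\SalesDb_full.bak';                    -- full
BACKUP DATABASE SalesDb TO DISK = N'D:\Backups\SalesDb_diff.bak' WITH DIFFERENTIAL;  -- differential
BACKUP LOG SalesDb TO DISK = N'D:\Backups\SalesDb_log.trn';                          -- transaction log
BACKUP DATABASE SalesDb TO DISK = N'D:\Backups\SalesDb_copy.bak' WITH COPY_ONLY;     -- copy-only full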

 

 

 Yesterday's Question of the Day (by Steve Jones - SSC Editor)

Expanding the data frame

I have this data frame of data:

>>> df
  BillMonth  BillingAmount
0       May            300
1      June            450
2      July            600
3    August            900

I have imported numpy as np and pandas as pd.

I want to convert this to daily data for each month. My goal here is to get 30 rows for each existing row and then edit the data frame later to reflect the days. Which of these lines will expand my data frame to 30 rows for each month?

Answer: daily_df = pd.concat([df]*30, ignore_index=True)

Explanation: The concat function of the pandas module will allow you to do this. Passing a list that contains the data frame repeated 30 times ([df]*30) stacks 30 copies of the frame, so each existing row appears 30 times, and ignore_index=True renumbers the resulting rows. Ref: concat - https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.concat.html

Discuss this question and answer on the forums

 

Featured Script

A varchar(MAX) STRING_SPLIT function for SQL 2012 and above

Jonathan AC Roberts from SQLServerCentral.com

An alternative to Microsoft's STRING_SPLIT function that will work on SQL Server 2012 and higher.

-- Create a stub inline table-valued function if it doesn't already exist,
-- so the ALTER below works whether or not the function was deployed before.
IF OBJECT_ID('[dbo].[STRING_SPLIT]','IF') IS NULL BEGIN
EXEC ('CREATE FUNCTION [dbo].[STRING_SPLIT] () RETURNS TABLE AS RETURN SELECT 1 X')
END
GO
ALTER FUNCTION [dbo].[STRING_SPLIT]
(
@string nvarchar(MAX),
@separator nvarchar(MAX)
)
RETURNS TABLE WITH SCHEMABINDING
AS RETURN
-- X is 16 rows; Y cross joins X with itself 8 times, giving a tally source of up to 16^8 = ~4 billion rows
WITH X(N) AS (SELECT 'Table1' FROM (VALUES (0),(0),(0),(0),(0),(0),(0),(0),(0),(0),(0),(0),(0),(0),(0),(0)) T(C)),
Y(N) AS (SELECT 'Table2' FROM X A1, X A2, X A3, X A4, X A5, X A6, X A7, X A8), -- Up to 16^8 = 4 billion
-- T numbers the character positions of @string, starting at 0
T(N) AS (SELECT TOP(ISNULL(LEN(@string),0)) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) -1 N FROM Y),
-- Delim finds each position at which the separator occurs, plus 0 as the starting sentinel
Delim(Pos) AS (SELECT t.N FROM T t WHERE (SUBSTRING(@string, t.N, LEN(@separator+'x')-1) LIKE @separator OR t.N = 0)),
-- Separated extracts the text between each delimiter position and the next (LEAD supplies an "end of string" position for the last element)
Separated(value) AS (SELECT SUBSTRING(@string, d.Pos + LEN(@separator+'x')-1, LEAD(d.Pos,1,2147483647) OVER (ORDER BY (SELECT NULL)) - d.Pos - LEN(@separator))
FROM Delim d
WHERE @string IS NOT NULL)
SELECT s.value
FROM Separated s
WHERE s.value <> @separator
GO
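
If you want to sanity-check the function after deploying it, a call like this (with made-up input) should return one row per element:

-- Illustrative call: split a comma-delimited string into rows.
SELECT value
FROM dbo.STRING_SPLIT(N'alpha,bravo,charlie', N',');
-- expected: alpha, bravo, charlie as three rows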

More »

 

Database Pros Who Need Your Help

Here's a few of the new posts today on the forums. To see more, visit the forums.


SQL Server 2017 - Administration
restore database dialog - I have a few servers where in my restore database dialog/wizard for some reason the default path to the files for "restore as" is missing the "\" for the file path.  I have my data on a S: drive and my logs on a L: drive so in the dialog it always shows restore as […]
SQL Server 2016 - Administration
Check DB is failing - one of the job is getting failing with the Command: DBCC CHECKDB TPX WITH NO_INFOMSGS, ALL_ERRORMSGS, DATA_PURITY Msg 3314, Level 17, State 3, Server DTX002 , Line 1 During undoing of a logged operation in database 'TPX', an error occurred at log record ID (235571:128656:345). Typically, the specific failure is logged previously as an error […]
SQL Server 2016 - Development and T-SQL
Need to use Remain value in calculation - Hello community, I need to performe a calculation by Row using remain values for the same Receipt_nr --This is a Temp table for test purpose CREATE TABLE #temptab (docdata DATE, Invoice VARCHAR(29), nrInvoice INT, totalinvoice NUMERIC(14,2), totalreceived NUMERIC(14,2), Receipt_nr INT, TotalPaymetforInvoice NUMERIC(14,2), remain NUMERIC(14,2) ) INSERT INTO #temptab (docdata, Invoice, nrInvoice, totalinvoice, totalreceived, Receipt_nr, TotalPaymetforInvoice, […]
Unable to display estimated execution plan - Hi, I am getting the below error even after closing and reopening ssms and no other query running under my login however it is quite possible other users...from what I see from sp_whoisactive only 1 user and few maintenance queries are running.  However, instance does not respond slow. I want to generate estimated plan so […]
Installing Call Quality Dashboard 2019 problems - I am trying to install Call Quality Dashboard 2019 on a Windows Server 2016 Datacentre and SQL Server version 2016.  Skype for Business Server 2015 is the version of Skype we are using. However I am having problems trying to install CQD, it seems to fail on creating the agent job on the QoE archiving […]
SQL 2012 - General
Azure SQL Database - I have taken over responsibility of an Azure SQL database. I want to make a true one time backup of this database. By true I mean that if I lost the database I could restore it and it would look exactly like it did before. When I go into SQL Server Management Studio (version 15.0.18142.0) […]
How to write select statment get top highest 10000 on count from table parts.roh - I work on sql server 2012 and i need to get top 10000 revision_id and count parts related that having highest parts from parts.Rohs ? this table have one million rows of parts but parts are repeated so that i need to count parts related to every revision it then get top 10000 have counts […]
SQL Server 2012 - T-SQL
How to speed up the query speed between different server tables - There is a SQL statement to select data from 2 SQL Server tables, and the the collation of these 2 servers is different.  and I want to know below, thanks! how to speed up the query speed while the SQL statement need to access tables on 2 SQL Server server ? The 2 servers' collation […]
SQL Azure - Administration
Impersonating a server principal (login) in Azure SQL MI - Hi all, I seem to be doing something wrong or trying to do something that's not supported. I frequently use EXEC AS LOGIN = 'DOMAIN\USERNAME' to test permissions and access on on-premises SQL instances (usually combined with sys.fn_mypermissions). However this doesn't seem to work on Azure SQL Managed Instances (version at date of this post: […]
Reporting Services
Drillthrough Reports - I have created a report (report1) that I can then drill into report2 (Action > Go to Report) - On report1, I specify name of report2 and give it the parameter report2 needs. Test this out and all works fine - I can click on my "hyperlink" on report1 and it will take me to […]
Analysis Services
Partition Caching - Ahoi, from what an external consultant once told me, the cube cache is cleared once a partition is processed. This results in queries executed from these measures/partitions being slower. This is problematic since we daily process the partitions of the current year, which leads to the cache being cleared every morning. This is what i […]
Integration Services
How to achieve parallelism in BULK INSERT with OLE DB source and target objects - Hi all! I need help to get the performance up on my ETL. My data flow task is very simple in nature: OLE DB source (MS SQL Server table) points to OLE DB target (MS SQL Server table) with TABLOCK hint The target table is uncompressed, has no indices or other constraints (PKs, Uniques) -> […]
SSISDB vs Control Table for Incremental Loads - I think I understand the concept of control tables for incremental loads.  But can't the tables in SSISDB be used in a similar fashion?
Microsoft.Office.Interop.Excel Throws Exception - I am testing an SSIS package to see if we have all required software installed on the server to run SSIS jobs. Server config: Windows Server 2016, SQL Server 2016, Visual Studio 2017 (on server) Package has 1 Script Task, using VB.Net 2017, added Reference to Microsoft Excel 15.0 Object Library for Interop assembly. Script […]
Flat File connection configuration - I know I have done this in the past, but for the life of me I cannot remember how.  We have address records that are coming from running a stored procedure whereas the fields are separated by double quotes.  The issue is with any address that might contain a comma.  If there is a comma, […]
 

 

RSS Feed | Twitter

This email has been sent to {email}. To be removed from this list, please click here. If you have any problems leaving the list, please contact the webmaster@sqlservercentral.com. This newsletter was sent to you because you signed up at SQLServerCentral.com.
©2019 Redgate Software Ltd, Newnham House, Cambridge Business Park, Cambridge, CB4 0WZ, United Kingdom. All rights reserved.
webmaster@sqlservercentral.com

 

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -