Additional Articles


External Article

Using the Data Profiling SQL Server Integration Services SSIS task

Have you ever had to write a bunch of T-SQL to do some data analysis on the table data in your database? If you have, you'll know that this can become a fairly time-consuming and tedious task. SQL Server 2012 Integration Services has a feature called the Data Profiling task that can help you perform this analysis much more easily and quickly (this feature is also available in SQL Server 2008). This task can help you find potential issues with your existing data, as well as help you become more familiar with the data in a database that you have just started managing.
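
For context, the hand-written analysis the task automates looks something like the sketch below; dbo.Customer and its Email column are hypothetical examples, not from the article.

    -- Profile one column by hand: row count, distinct values, NULLs, max length.
    -- (dbo.Customer and Email are placeholder names for illustration.)
    SELECT
        COUNT(*)                                        AS TotalRows,
        COUNT(DISTINCT Email)                           AS DistinctEmails,
        SUM(CASE WHEN Email IS NULL THEN 1 ELSE 0 END)  AS NullEmails,
        MAX(LEN(Email))                                 AS MaxEmailLength
    FROM dbo.Customer;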

2013-02-04

3,103 reads

External Article

Providing SQL Agent Job Log Data for Developers

As a production Database Administrator, I do not want to give Developers direct access to the SQL Server Agent job log, especially for servers in the DMZ. Another problem is that when a job produces a lot of log data, the default job log doesn't contain the full detail, which makes troubleshooting harder. Most of all, we are trying to avoid using a different code set for deployment based on the environment: we want to use the same methods to deploy our jobs to Development, Test, and Production.
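
One common pattern, shown here only as a sketch and not necessarily the article's exact method, is to wrap msdb's job history in a view and grant Developers read access to the view instead of msdb itself; DevJobLogReaders is a hypothetical role.

    USE msdb;
    GO
    -- Read-only window onto Agent job history for Developers.
    CREATE VIEW dbo.DevJobLog
    AS
    SELECT j.name      AS JobName,
           h.step_id   AS StepId,
           h.step_name AS StepName,
           h.run_status,   -- 0 = failed, 1 = succeeded
           h.run_date,
           h.run_time,
           h.message       -- note: may be truncated in the default history
    FROM dbo.sysjobs j
    JOIN dbo.sysjobhistory h
      ON h.job_id = j.job_id;
    GO
    GRANT SELECT ON dbo.DevJobLog TO DevJobLogReaders;  -- hypothetical role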

2013-01-29

1,816 reads

Technical Article

Beating Backup Corruption

The most critical task for all DBAs is to have a backup and recovery strategy that ensures, every day, that in the event of a disaster they can restore and recover any database within acceptable limits for data loss and downtime. Even with all the required backups in place, it's easy to miss subtle failings in the overall plan that can, and eventually will, defeat your recovery plans.
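
Two basic safeguards against silently corrupt backups, shown only as an illustration (the database name and file path are placeholders):

    -- Write page checksums into the backup so corruption is detectable.
    BACKUP DATABASE Sales
    TO DISK = N'X:\Backups\Sales.bak'
    WITH CHECKSUM, INIT;

    -- Confirm the backup file is readable and its checksums are valid.
    RESTORE VERIFYONLY
    FROM DISK = N'X:\Backups\Sales.bak'
    WITH CHECKSUM;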

2013-01-28

2,889 reads

External Article

We Loaded 1TB in 30 Minutes with SSIS, and So Can You

In February 2008, Microsoft announced a record-breaking data load using Microsoft® SQL Server® Integration Services (SSIS): 1 TB of data in less than 30 minutes. That load was 30% faster than the previous best time using a commercial ETL tool. This paper outlines what it took: the software, hardware, and configuration used. We describe what we did to achieve that result and offer suggestions for how to relate these techniques to typical scenarios. Even customers whose needs don't quite match this benchmark can learn a lot from such efforts about getting optimal performance.
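
One technique in this family is a minimally logged bulk load; the sketch below is illustrative only, with a hypothetical table and data file, not the benchmark's actual configuration.

    -- Bulk-load a flat file with minimal logging (table and path are placeholders).
    BULK INSERT dbo.LineItem
    FROM 'C:\staging\lineitem.dat'
    WITH (
        TABLOCK,                  -- table lock allows minimally logged inserts
        FIELDTERMINATOR = '|',
        ROWTERMINATOR   = '\n',
        BATCHSIZE       = 100000  -- commit in chunks to keep the log manageable
    );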

2013-01-25

9,371 reads

Blogs

Enterprise AI Operating Rhythm – Top 5 practices for 2026

2025 exposed a growing gap between AI ambition and operational reality. As budgets tightened...

Stop Being Surprised by Your Azure Bill: Use Budgets

When organizations migrate workloads to Azure, the focus is usually on architecture, performance, and...

Resetting on the AI hype train

There's a great article from MIT Technology Review about resetting on the hype of...

Read the latest Blogs

Forums

The Max PK Length

By Steve Jones - SSC Editor

Comments posted to this topic are about the item The Max PK Length

My experience using the GitHub Copilot in SSMS 22

By Daniel Calbimonte

Comments posted to this topic are about the item My experience using the GitHub...

The Microsoft SQL Year in Review

By Steve Jones - SSC Editor

Comments posted to this topic are about the item The Microsoft SQL Year in...

Visit the forum

Question of the Day

The Max PK Length

If I create a multiple-column Primary Key constraint, what is the maximum number of bytes I can include in the constraint?

See possible answers
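
For context, a composite key like the one below counts the combined maximum byte width of all its key columns against SQL Server's index key size limit; the table and columns are hypothetical.

    CREATE TABLE dbo.OrderLine (
        OrderNumber varchar(400) NOT NULL,
        LineCode    varchar(400) NOT NULL,
        Quantity    int          NOT NULL,
        -- the two key columns together can hold up to 800 bytes
        CONSTRAINT PK_OrderLine PRIMARY KEY (OrderNumber, LineCode)
    );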