Guide to enhancing privacy and addressing GDPR requirements with the Microsoft SQL platform
A new whitepaper published today gives Microsoft SQL customers technical guidance for how to approach GDPR compliance with Microsoft SQL technologies.
Scala and Apache Spark might seem an unlikely medium for implementing an ETL process, but there are reasons for considering them as an alternative. After all, many Big Data solutions are ideally suited to the preparation of data for input into a relational database, and Scala is a well thought-out and expressive language. Krzysztof Stanaszek describes some of the advantages and disadvantages of a Scala-based approach to implementing and testing an ETL solution.
An example of exporting and importing table data with JSON in Azure and SQL Server 2016.
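The core of that technique pairs FOR JSON for export with OPENJSON for import. A minimal sketch, assuming an illustrative dbo.Person table (the table and column names here are placeholders, not taken from the article):

```sql
-- Export rows as a JSON array of objects (SQL Server 2016+).
SELECT PersonID, FirstName, LastName
FROM dbo.Person
FOR JSON PATH;

-- Re-import: shred a JSON document back into rows with OPENJSON.
DECLARE @json NVARCHAR(MAX) =
    N'[{"PersonID":1,"FirstName":"Ann","LastName":"Lee"}]';

INSERT INTO dbo.Person (PersonID, FirstName, LastName)
SELECT PersonID, FirstName, LastName
FROM OPENJSON(@json)
WITH (PersonID INT, FirstName NVARCHAR(50), LastName NVARCHAR(50));
```

The WITH clause maps JSON properties onto typed columns, which keeps the import strongly typed rather than returning generic key/value pairs.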
Some people will assure you that you can't do any serious statistical calculations in SQL. In the first of a series of articles, Phil Factor aims to prove them wrong by explaining how easy it is to calculate Pearson's Product Moment Correlation.
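The computational formula for Pearson's r reduces to a handful of aggregates, so it fits in a single query. A minimal sketch, assuming a hypothetical table Readings(x, y) with float columns (not a table from the article):

```sql
-- Pearson's Product Moment Correlation as plain aggregates:
-- r = (n*Sxy - Sx*Sy) / (sqrt(n*Sxx - Sx^2) * sqrt(n*Syy - Sy^2))
SELECT
    (COUNT(*) * SUM(x * y) - SUM(x) * SUM(y))
    / (SQRT(COUNT(*) * SUM(x * x) - SQUARE(SUM(x)))
     * SQRT(COUNT(*) * SUM(y * y) - SQUARE(SUM(y)))) AS pearson_r
FROM Readings;
```

Because the columns are float, the integer COUNT(*) is implicitly promoted and the division stays in floating point; with integer columns you would need explicit casts to avoid integer division.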
SQL Server is becoming more capable all the time, requiring fewer human resources for basic management.
Partitioning data is a standard SQL Server administration practice. Partitions enable independent administration of different slices of data. When a SQL Server Analysis Services (SSAS) tabular data model is developed and processed, data is read from the source system and loaded into the tabular data model configured in In-Memory processing mode. Not every processing run requires reloading the entire data set; often only certain slices of data contain changes, and those slices can be re-processed independently by partitioning the data logically. In this post, Siddharth Mehta looks at how to partition tables in Tabular SSAS.
When you are rapidly deploying an updated SSIS project, there are a number of things you have to check to make sure that the deployment will be successful. These include settings such as the values of environment variables, package parameters, and project parameters. The DbFit test framework turns out to be ideal for performing final checks as part of a deployment process, as Nat Sundar demonstrates.
In this article, I will provide a set of examples showcasing the use of the OUTPUT clause to capture the rows modified by UPDATE statements into a table variable.
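The basic shape of that pattern: the OUTPUT clause exposes the deleted (before) and inserted (after) images of each updated row, which can be routed into a table variable. A minimal sketch, assuming an illustrative dbo.Products table (names are placeholders):

```sql
-- Table variable to hold the audit trail of the update.
DECLARE @Changes TABLE (ProductID INT, OldPrice MONEY, NewPrice MONEY);

-- OUTPUT captures before/after values row by row as the UPDATE runs.
UPDATE dbo.Products
SET Price = Price * 1.10
OUTPUT inserted.ProductID, deleted.Price, inserted.Price
INTO @Changes (ProductID, OldPrice, NewPrice)
WHERE Category = 'Widgets';

-- The table variable now holds one row per updated product.
SELECT ProductID, OldPrice, NewPrice FROM @Changes;
```

Here deleted.Price is the pre-update value and inserted.Price the post-update value, so the table variable doubles as a lightweight change log without a trigger.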
PlanTrace Now Supports PostgreSQL The same plan analysis you know from...
By Steve Jones
The Kinder Surprise – the point in your early adolescence when you realize...
If you’ve been following my T-SQL Snapshot Backup series, most of what I’ve covered...
Comments posted to this topic are about the item SSRS Is Dead. Here Are...
Comments posted to this topic are about the item The Distance Metric
Comments posted to this topic are about the item The New Wave of Security...
In the new VECTOR_DISTANCE() function in SQL Server 2025, the first parameter is the distance_metric. What is this?
See possible answers