Revolutionizing Real-Time Data Management with SQL Server, Kafka, and Informatica
This article examines how one can structure a pipeline for processing real-time data using Kafka and Informatica.
2023-04-26
4,499 reads
See a simple demonstration of how Kafka can stream events for changes to data in SQL Server tables.
2021-07-02 (first published: 2021-05-27)
15,840 reads
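Demonstrations like the one above usually rely on SQL Server Change Data Capture (CDC) as the source of the events that a connector (for example Debezium for SQL Server) publishes to Kafka. As a minimal sketch only, assuming a hypothetical Sales database with a dbo.Orders table (names are illustrative, not taken from the article), the SQL Server side of the setup looks like this:

USE Sales;
GO
-- Enable CDC at the database level (requires sysadmin)
EXEC sys.sp_cdc_enable_db;
GO
-- Enable CDC on the table whose changes should flow to Kafka
-- (@supports_net_changes = 1 requires a primary key or unique index)
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'Orders',
    @role_name     = NULL,
    @supports_net_changes = 1;
GO
-- Once enabled, a CDC-aware connector can read the cdc.dbo_Orders_CT
-- change table and publish each insert, update, and delete as an event
-- to a Kafka topic.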
By James Serra
Microsoft Fabric is rapidly gaining popularity as a unified data platform, leveraging OneLake as...
By Steve Jones
I saw a post from Erin that Preview 2 is available. I’d gotten a...
By Steve Jones
Can an AI help me with some database API work? Let’s see. This is...
Hi everyone I have a query that is taking a real long time. It...
Comments posted to this topic are about the item The New Log File
Comments posted to this topic are about the item Getting Started with the Data...
I have a detached SQL Server 2019 database called TDE_Primer. The database had a 100 MB data file and a 73 MB log file. The log file was lost, so I need to run this code:
USE [master]
GO
CREATE DATABASE [TDE_Primer]
ON ( FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL15.MSSQLSERVER\MSSQL\DATA\TDE_Primer.mdf' )
FOR ATTACH_REBUILD_LOG
GO

How big is the new log file?
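Whatever the answer, one quick way to check the file sizes after the attach completes is to query the database's file metadata (a sketch; size is stored in 8 KB pages, so it is converted to MB here):

USE TDE_Primer;
GO
-- Report each file's name, type, and size in MB
SELECT name, type_desc, size * 8 / 1024.0 AS size_mb
FROM sys.database_files;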