Passing status messages and results back from Databricks to ADF
When we use ADF to call Databricks we can pass parameters in, which is nice. When the Databricks notebook finishes we often want to return something back to ADF so...
2020-02-05
2 reads
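The usual mechanism, sketched minimally below (the payload and the activity name 'MyNotebook' are illustrative), is for the notebook to finish by calling dbutils.notebook.exit() with a string, which the ADF Notebook activity then exposes to the pipeline as @activity('MyNotebook').output.runOutput:

import json

# Runs inside a Databricks notebook, where dbutils is provided by the runtime.
result = {"status": "OK", "rows_processed": 42}  # illustrative payload
dbutils.notebook.exit(json.dumps(result))  # ADF reads this value as runOutput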
When you use Delta Lake there are a couple of interesting things to note, stemming from the fact that the data is stored in parquet files, which are read-only...
2020-01-20
13 reads
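A minimal sketch of what that means in practice, assuming a Delta-enabled Spark session (on Databricks, spark is already defined) and an illustrative path: a DELETE never edits a parquet file in place; it writes new files and marks the old ones as removed in the transaction log.

# Write five rows, then delete one; the path is illustrative.
spark.range(5).write.format("delta").save("/tmp/delta-demo")
spark.sql("DELETE FROM delta.`/tmp/delta-demo` WHERE id = 3")
# The original parquet files are still on disk (until VACUUM runs);
# only the JSON entries in _delta_log decide which files are "live".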
It is a non-null constraint, not a non-ish-null constraint. You are writing an ETL process; as part of this process you need to import a semi-structured file (think CSV, JSON,...
2019-10-28
168 reads
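To make the title concrete, a tiny illustrative sketch (the rows are made up): empty and whitespace-only strings satisfy a plain not-null check even though they carry no information, which is exactly the "non-ish-null" trap.

# Illustrative only: "null-ish" values pass a plain not-null check.
rows = [{"name": "alice"}, {"name": ""}, {"name": "   "}, {"name": None}]

not_null = [r for r in rows if r["name"] is not None]
present = [r for r in rows if r["name"] is not None and r["name"].strip()]

print(len(not_null))  # 3 -- a NOT NULL constraint would keep all of these
print(len(present))   # 1 -- only one row carries a real value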
It has been a little while but I have updated SQLCover to include a number of fixes and small features, the majority of which are improvements to the HTML...
2019-10-30 (first published: 2019-10-16)
407 reads
This is the final part in the four-part series on testing ETL pipelines, how exciting!
Part 1 - Unit Testing https://the.agilesql.club/2019/07/how-do-we-test-etl-pipelines-part-one-unit-tests/
Part 2 - Integration Testing https://the.agilesql.club/2019/08/how-do-we-prove-our-etl-processes-are-correct-how-do-we-make-sure-upstream-changes-dont-break-our-processes-and-break-our-beautiful-data/
Part 3...
2019-10-02
122 reads
“[Error] [JvmBridge] java.sql.SQLException: No suitable driver” - unable to connect Spark to Microsoft SQL Server.
In Spark, when you want to connect to a database you use Read(), passing in...
2019-10-01
423 reads
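The usual fix, sketched here as the PySpark equivalent (server, database, table, and credentials are placeholders), is to ship the Microsoft JDBC jar with the job and name the driver class explicitly, since it is not discovered automatically:

# e.g. spark-submit --jars mssql-jdbc-8.4.1.jre8.jar my_job.py
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

df = (spark.read
      .format("jdbc")
      .option("url", "jdbc:sqlserver://myserver:1433;databaseName=mydb")
      .option("dbtable", "dbo.some_table")
      .option("user", "etl_user")
      .option("password", "********")
      .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
      .load())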
Which Result II
I have this code in SQL Server 2022:
CREATE SCHEMA etl;
GO
CREATE TABLE etl.product
(
    ProductID INT,
    ProductName VARCHAR(100)
);
GO
INSERT etl.product
VALUES
    (2, 'Bee AI Wearable');
GO
CREATE TABLE dbo.product
(
    ProductID INT,
    ProductName VARCHAR(100)
);
GO
INSERT dbo.product
VALUES
    (1, 'Spiral College-ruled Notebook');
GO
CREATE OR ALTER PROCEDURE etl.GettheProduct
AS
BEGIN
    EXEC ('SELECT ProductName FROM product;');
END;
GO
EXEC etl.GettheProduct;
When I execute this code as a user whose default schema is dbo and who has rights to the tables and the proc, what is returned?