April 15, 2022 at 7:44 pm
I'm trying to copy fixed-width files from an on-premises file share to Azure Blob Storage. I'm getting intermittent success, but not what I'm looking for. I used the Copy activity. When it works, it copies the file, but each row is wrapped in double quotes. I can run the same pipeline two or three times and it works, then it times out. I can figure out the timeouts; the real question is how to copy the file from on-premises through the self-hosted integration runtime (SHIR) to Azure Blob Storage. I can't use the Azure IR yet, since the data hasn't reached the cloud yet.
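For reference, the double quotes usually come from the sink dataset's default quote character when each fixed-width row is written out as a single text column. A minimal sketch of two ways around it, shown as the dataset JSON built from Python dicts (the dataset, linked service, container, and folder names here are placeholders, not values taken from the post):

```python
import json

# Option 1 (assumption): treat the copy as a straight byte-for-byte file move.
# A Binary source and Binary sink skip text parsing entirely, so no quoting
# can be added by the Copy activity.
binary_sink = {
    "name": "FixedWidthBinarySink",
    "properties": {
        "type": "Binary",
        "linkedServiceName": {
            "referenceName": "AzureBlobStorageLS",
            "type": "LinkedServiceReference",
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "landing",
                "folderPath": "fixedwidth",
            }
        },
    },
}

# Option 2 (assumption): if the sink has to stay DelimitedText, set quoteChar
# to an empty string so rows are written without surrounding quotes.
delimited_sink = {
    "name": "FixedWidthTextSink",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "AzureBlobStorageLS",
            "type": "LinkedServiceReference",
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "landing",
                "folderPath": "fixedwidth",
            },
            "columnDelimiter": "\u0001",  # delimiter that never appears in the data
            "quoteChar": "",              # no quoting on write
            "firstRowAsHeader": False,
        },
    },
}

print(json.dumps(binary_sink, indent=2))
print(json.dumps(delimited_sink, indent=2))
```

Either JSON body can be pasted into the dataset's code view in the Data Factory UI; if the goal is just to land the file unchanged, the Binary-to-Binary route is probably the safer of the two.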
For better, quicker answers, click on the following...
http://www.sqlservercentral.com/articles/Best+Practices/61537/
For better answers on performance questions, click on the following...
http://www.sqlservercentral.com/articles/SQLServerCentral/66909/
April 16, 2022 at 8:10 pm
Thanks for posting your issue and hopefully someone will answer soon.
This is an automated bump to increase visibility of your question.
April 26, 2022 at 5:40 pm
We need more information to try and help. Can you post screenshots of your copy activity and its properties, as well as the current and expected output?
April 14, 2023 at 4:39 pm
Hi all,
I'm working on a new pipeline on our Azure stack that is supposed to read Excel sheets stored in Google Drive.
We're using Data Factory to read those files, but it doesn't have a direct connector for Google Drive.
So I'm assuming I have basically two options:
Transfer all those files to SharePoint manually, or ask clients to use it instead of Google Drive (not nice)
Create an automation or pipeline on GCP that extracts files from Google Drive and sends them to Google Cloud Storage (Data Factory has a built-in connector for it); see the sketch below
How would you handle this scenario? Are there more options available?
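If option 2 looks viable, a rough sketch of that Drive-to-GCS hop using the Drive v3 API and the Cloud Storage client might look like the following (the folder ID, bucket name, and service-account file are placeholders, and the service account would need read access to the Drive folder):

```python
import io

from google.cloud import storage
from google.oauth2 import service_account
from googleapiclient.discovery import build
from googleapiclient.http import MediaIoBaseDownload

# Placeholder values -- swap in your own.
FOLDER_ID = "your-drive-folder-id"
BUCKET = "your-gcs-bucket"
CREDS_FILE = "service-account.json"
XLSX_MIME = "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"

creds = service_account.Credentials.from_service_account_file(
    CREDS_FILE,
    scopes=[
        "https://www.googleapis.com/auth/drive.readonly",
        "https://www.googleapis.com/auth/devstorage.read_write",
    ],
)
drive = build("drive", "v3", credentials=creds)
bucket = storage.Client(credentials=creds).bucket(BUCKET)

# List the .xlsx files sitting in the Drive folder (native Google Sheets
# would need files().export_media instead of get_media).
resp = drive.files().list(
    q=f"'{FOLDER_ID}' in parents and mimeType='{XLSX_MIME}' and trashed=false",
    fields="files(id, name)",
).execute()

for f in resp.get("files", []):
    # Download each file into memory and push it to Cloud Storage,
    # where Data Factory's built-in GCS connector can pick it up.
    buf = io.BytesIO()
    downloader = MediaIoBaseDownload(buf, drive.files().get_media(fileId=f["id"]))
    done = False
    while not done:
        _, done = downloader.next_chunk()
    buf.seek(0)
    bucket.blob(f["name"]).upload_from_file(buf)
```

Something like this could be scheduled on the GCP side (Cloud Functions plus Cloud Scheduler, for example), and the Data Factory side then only needs the existing Google Cloud Storage connector.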
Thanks in advance!
April 14, 2023 at 6:30 pm
Nothing like a bit of PPTS. Can't help but watch.
--Jeff Moden
Change is inevitable... Change for the better is not.
April 15, 2023 at 4:24 pm
This was removed by the editor as SPAM
April 15, 2023 at 5:09 pm
jabujakan wrote: Hi all,
I'm working on a new pipeline on our Azure stack that is supposed to read Excel sheets stored in Google Drive.
We're using Data Factory to read those files, but it doesn't have a direct connector for Google Drive.
So I'm assuming I have basically two options:
Transfer all those files to SharePoint manually, or ask clients to use it instead of Google Drive (not nice)
Create an automation or pipeline on GCP that extracts files from Google Drive and sends them to Google Cloud Storage (Data Factory has a built-in connector for it)
How would you handle this scenario? Are there more options available?
SPAM LINK REMOVED
Thanks in advance!
I got this,...
Nope... I've got you. 😉
--Jeff Moden
Change is inevitable... Change for the better is not.