Access Denied and Cannot Open the Datafile

  • I'm guessing your account has the necessary share and NTFS permissions on the remote file share; that's why it works through VS.

    Setting up an agent proxy would be the best idea as already suggested.

    Otherwise, check the account that is initiating the access request and ensure it has the necessary access to the remote share.

    -----------------------------------------------------------------------------------------------------------

    "Ya can't make an omelette without breaking just a few eggs" 😉

  • JJ B wrote:

    Do you mean the following? Copy the .csv files to the local hard drive on the SQL Server.  Then the SQL Server shouldn't have any trouble accessing the .csv files.

    If that's what you are thinking, I had considered the idea, but I was so sure that there was an easier answer and that it wouldn't have to be this complicated.  However, I certainly could do that if I could get the following to work without a double-hop problem (sounds like a beer problem): if I could create a batch file that the SSIS job can run, which copies the files from our file server to the hard drive of the SQL Server, then that may be the work-around.  I could test whether the SSIS package would run the batch file (which I presume would also have to be stored on the SQL Server hard drive) without a permissions issue.  Maybe I should give that a try and stop trying to get the double-hop issue figured out?

    Not copying the file to the server locally, although that would solve your double-hop issues, but mapping the network drive as F:\, for example.

    Kerberos is a pain, but the problem is worth solving if you need this functionality.

    There are also other possible options:

    • Create a SQL Agent job (which runs within the context of a proxy account with permissions) and give the end user the necessary permissions to execute the job, or create a stored proc to execute the job and grant the end user permissions to execute the proc (see the sketch after this list).
    • As others have mentioned, you could build logic into the SSIS package that checks for an indication that the data is ready for processing (e.g. a file exists in a folder, a flag is set in a table, etc.), and then run the package periodically on a schedule.
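    A minimal sketch of the wrapper-proc half of that first option, with placeholder job and user names (note the caller may still need msdb-side permissions on sp_start_job, which comes up later in this thread):

    CREATE PROCEDURE dbo.usp_StartImportJob  -- hypothetical name
    AS
    BEGIN
        -- the Agent job runs under the job's proxy context, not the caller's account
        EXEC msdb.dbo.sp_start_job @job_name = N'MyImportJob';  -- placeholder job name
    END;
    GO
    GRANT EXECUTE ON dbo.usp_StartImportJob TO [DOMAIN\AppUser];  -- placeholder user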

    Ultimately though, if you need the behavior of manual package execution from the catalog to work for this scenario, then it'll be worth your while to get constrained Kerberos delegation in place.
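    If you want to confirm whether a given connection is actually coming in over Kerberos (a prerequisite for delegation) rather than falling back to NTLM, this quick check against a standard DMV will tell you:

    SELECT auth_scheme       -- returns SQL, NTLM, or KERBEROS
    FROM sys.dm_exec_connections
    WHERE session_id = @@SPID;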

  • Jeffrey: Thanks for clarifying the process you are thinking of. There's some kind of disconnect here as I'm still not fully following you. That's OK though. We can let this go as I have lots of ideas to follow up on.

    Perry: Thanks for weighing in. It's helpful to see the wisdom of the crowd so I can see how others usually handle this problem.

    Martin: There was lots of good info in your post. I appreciate the clarification on what you meant re: mapping the network drive. I think that's brilliant, because that was the first thing that I had asked for. 😉 Result was: I can remote into our SQL Server as myself and see the "p:\" drive just fine. But the job was still failing when run on the server. I think it's because the job is running under a different account than mine, i.e., the troubling double-hop thing, or maybe because of Jeffrey's idea of the SQL Server service account being used.

    re: Martin's idea of getting "constrained Kerberos delegation" in place: that's a concept I'm going to look up! Sounds like it might help.

    Everyone: I've been sharing this conversation with my network administrators. Their preferred next step is to replace the accounts running the SQL Server services with GMSA accounts that can safely be allowed access to the file server. I'm now skeptical that this will work, since someone on this thread pointed out that the Catalog reports can show the account that is running the job. I see "caller" in the Catalog report, and caller is showing my account. So it seems unlikely that it will work, but changing access for the SQL Server service accounts is the next step that we are going to try.

    If that fails, I'm going to try the idea (that several people had) of trying to use an appropriate proxy on the Agent job and then creating a stored procedure which can run the Agent job and which my user could run. I may also look into that idea of "constrained Kerberos delegation" to see if there is something that can be done to fix a double hop problem.

    Thanks everyone! I promise to get back to you on what happens. Just give me a few days to try the various ideas.

  • No shortage of opinions...eh...ideas here, so you have lots to play with 🙂

    Side-Note: In order for everyone to see a mapped drive, an administrator will have to create it and relevant permissions have to be set...as far as I recall.

    A mapped drive is user-profile specific; one user mapping H: to a share won't be the same for other users logging on.

    Use the UNC path and just ensure the account initiating the access request has the appropriate permissions.

    -----------------------------------------------------------------------------------------------------------

    "Ya can't make an omelette without breaking just a few eggs" 😉

  • JJ B wrote:

    Jeffrey: Thanks for clarifying the process you are thinking of. There's some kind of disconnect here as I'm still not fully following you. That's OK though. We can let this go as I have lots of ideas to follow up on.

    The issues you are dealing with all stem from the fact that you want this to be run by the user.  If your package used a foreach loop container with specific filtering set - and a file system task (or script task - which I prefer) to 'move' the processed file to a different folder (Archive, Completed, etc...) - then each time the package runs, if there are no files to be found it just exits.

    If that job were scheduled to run every 5 or 10 minutes - then the user only needs to drop the CSV file into the 'To Be Processed' folder.  The next time the package runs - it finds a file to be processed and loads the file, then moves it to the archive folder.

    To enable a user to initiate this process - you need to grant the user access to the database system with appropriate permissions.  That means SSMS (or Azure Data Studio) needs to be installed and configured for the user, and the user needs training on how to run a stored procedure or initiate the package execution.  What happens when that user is on PTO or changes jobs and a different user takes over?  And the new user doesn't have permissions - or SSMS - or any training?

    Personally - I would work with the users to change their workflow.  Instead of having them expect the files to be loaded immediately - set the expectation that the files will be loaded that night (or early morning) and the data available the following day.  They can start work in the morning on any errors from the previous load(s) - then later that day, download any new files and place them in the folder(s) to be processed.  The files are then processed that night and errors ready to be worked the following morning.

    If there are requirements for faster turnaround - then schedule the job to run more frequently.  Letting the users know that if they place the files before the top of the hour (for example) - the files will be loaded at that time and data available once the load process has completed.
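    For reference, scheduling a job that frequently is a one-liner against msdb; a sketch with a placeholder job name (the numeric values are the documented sp_add_jobschedule frequency codes):

    EXEC msdb.dbo.sp_add_jobschedule
        @job_name = N'MyImportJob',        -- placeholder job name
        @name = N'Every 10 minutes',
        @freq_type = 4,                    -- daily
        @freq_interval = 1,                -- every day
        @freq_subday_type = 4,             -- subday unit = minutes
        @freq_subday_interval = 10,        -- every 10 minutes
        @active_start_time = 000000;       -- starting from midnight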

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs

    THIS IS AN INTERIM UPDATE: I wanted to wait until I had a final answer before posting back here, but a lot of time has gone by and I can't take the next step until I get cooperation from my network administrator.  He hasn't gotten back to me these last few days, so I wanted to let you know what has happened so far.

    OVERVIEW:  I won't bore you with every little thing I did.  Here's the gist of the thinking now and what I have tried so far.

    I've given up on having my staff run the SSIS Catalog job directly via t-sql code.  The t-sql code works in that it kicks off the job, but there are permissions problems (probably related to "double hop") that we just can't figure out.  🙁

    So, the next best plan was to set up the job to run on Agent and use the stored procedure sp_start_job to run the Agent job.  I would then (theoretically) be able to let my user run sp_start_job.  The general consensus on this forum was to try a proxy account for the Agent job.  I did try the proxy and actually got the proxy account to work (a rough t-sql sketch follows these steps), in the sense that:

    ..... a) I created a relevant credential and proxy off the credential.  FYI: The credential/proxy was my own account.

    ..... b) I told the Agent job to "run as" the proxy.

    ..... c) I ran the job using sp_start_job and it worked!  (At least for me.  More on that below if interested.)
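    For anyone following along, those steps roughly correspond to t-sql like this (all names are placeholders, and the SECRET is the Windows password of the proxied account, which is exactly the rotation problem described next):

    -- a) create a credential, then a proxy on top of it
    CREATE CREDENTIAL ImportCredential
        WITH IDENTITY = N'DOMAIN\SomeAccount',   -- placeholder account
             SECRET = N'ItsCurrentPassword';     -- goes stale when the password rotates

    EXEC msdb.dbo.sp_add_proxy
        @proxy_name = N'ImportProxy',
        @credential_name = N'ImportCredential',
        @enabled = 1;

    EXEC msdb.dbo.sp_grant_proxy_to_subsystem
        @proxy_name = N'ImportProxy',
        @subsystem_name = N'SSIS';               -- let SSIS job steps use the proxy

    -- b) point the job step at the proxy ("Run as" in the GUI)
    EXEC msdb.dbo.sp_update_jobstep
        @job_name = N'MyImportJob',              -- placeholder job name
        @step_id = 1,
        @proxy_name = N'ImportProxy';

    -- c) kick it off
    EXEC msdb.dbo.sp_start_job @job_name = N'MyImportJob';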

    The problems are that: 1) as a best practice, we should not use my personal account, and 2) using another account is also out, because when the password for the credential/proxy account changes, the proxy stops working until the credential's secret is updated.  My network administrator is strongly opposed to the proxy solution for this reason.  (Because I guess our policy is to only have accounts which regularly change passwords?)

    Instead, my network administrator wants to do this:  Have the Agent job "run as" the normal "SQL Server Agent Service Account", and change the Service Account to be a GMSA account.  A GMSA account can be set up to have permissions to access our file server AND our network administrators consider it to be a secure account.  As I understand it, a GMSA account has a very complex password that changes automatically every 30 days and no one knows the password.  I'm not arguing that this is a better solution than using the proxy account.  I'm just saying what my network admin thinks.

    One night we did a test where we changed the SQL Server Agent Service Account to the desired GMSA account and made sure that permissions were set up for the GMSA account to access the file server.  I *was* able to schedule the Agent job to run successfully with the GMSA account as the Agent Service Account.

    HOWEVER, we had changed all the SQL Server Service Accounts (for SQL Server, SSIS, and Agent), not just the Agent account.  We tested my regular front-end applications after the Service Accounts were changed, and there was an initial problem for the front-end apps connecting to the database.  I got past that connection problem, but not in a way that would be acceptable for the users.  I think it is possible that all we will have to do is ask our users to reboot their PCs, but we weren't prepared to test the reboot idea out the night we did our GMSA test, so we rolled back the changes to the SQL Server Service Accounts.

    Since then, my network administrator went on vacation for a couple days.  He's back now, but he's not getting back to me so we can move forward on this project.

    Also note that there is still an issue to work through with the current chosen plan.  *I* can run sp_start_job, however, it doesn't look like normal users can run that stored procedure.  The permissions are explained by Microsoft here: ____ .  I still have to figure out what that Permissions section is saying and then implement it.
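    (From what I can tell so far, that Permissions section boils down to membership in one of the SQL Agent fixed database roles in msdb. Something like this, with a placeholder user:)

    USE msdb;
    GO
    CREATE USER [DOMAIN\AppUser] FOR LOGIN [DOMAIN\AppUser];    -- placeholder user
    -- SQLAgentUserRole members can only start jobs they own;
    -- SQLAgentOperatorRole members can start (but not modify) any job
    ALTER ROLE SQLAgentOperatorRole ADD MEMBER [DOMAIN\AppUser];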

    Also note that a minor drawback to this situation is that I cannot do any troubleshooting by right-clicking on a job and running it on the server.  If I want to run it on the server, I have to schedule it through Agent or use the sp_start_job stored procedure.  It's not the end of the world, but it's a bit of a bummer as I'm used to being able to immediately run jobs with the right-click, "Start job at step..." option.

    I will come back here and let you know when I have a final solution in place.  Thanks again for everyone's help.

  • Jeffrey: Thank you for clarifying how I could change my SSIS package so that it could run via Agent every X minutes.  I understand what you are saying now.  It is not my first-choice plan since the process would not be seamless for the user, but I like having this option in my back pocket.  Thank you for the idea.

    To clarify from my end:  As I understand your comment, you are envisioning that I will be having the user run the package from SSMS.  That's not the plan at all.  The whole reason I wanted the job to be able to run via t-sql is so that the user can run the package from a button in the front-end.  That's why I liked the idea of using the SSIS Catalog as it is easy to create t-sql to run a catalog project. At the time of my initial post, I didn't know that there was also a way to run an Agent job via t-sql.  I researched how to run Agent jobs via t-sql after seeing comments on this thread and found the sp_start_job built-in stored procedure.  (Telling me that there is some way to run Agent jobs via t-sql is one way in which people on this thread have helped me.)

    I 100% totally agree with you that it is not desirable to have a regular user do anything in SSMS!  That idea is not anywhere on my hierarchy of plans.  🙂  The user must be able to run the job by clicking a button in the front-end application.

  • I think you have made this more difficult than it needs to be...

    If you create a stored procedure to run the SSIS package from the catalog - and grant execute on that procedure to the application - then I think it will work without having to do anything else.  Now that you have a GMSA running the services - and have granted the necessary OS and SQL permissions to the GMSA account - the stored procedure will execute in the context of the SQL Server service account.
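    A bare-bones sketch of that, with placeholder names (the catalog procedures are the standard SSISDB ones; the full parameter set appears later in this thread):

    CREATE PROCEDURE dbo.usp_RunCatalogPackage  -- hypothetical name
    AS
    BEGIN
        DECLARE @execution_id bigint;

        -- create and start a catalog execution for the package
        EXEC SSISDB.catalog.create_execution
            @folder_name  = N'MyFolderName',
            @project_name = N'MyProjectName',
            @package_name = N'MyPackageName.dtsx',
            @execution_id = @execution_id OUTPUT;

        EXEC SSISDB.catalog.start_execution @execution_id;
    END;
    GO
    GRANT EXECUTE ON dbo.usp_RunCatalogPackage TO [DOMAIN\AppUser];  -- placeholder user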

     

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs

    Jeffrey: re: "..then the stored procedure will execute in the context of the SQL Server service account."  Thanks.  That is good to know.  I've been trying to follow the advice on this thread of using an Agent job and then running the Agent job via proxy and t-sql, but if there is a way to use t-sql that both avoids the double-hop problem (which I have not yet managed when running the catalog project directly) and avoids the extra permissions issues around letting my user run sp_start_job, that would be great.

    I'll give it a try when we get the GMSA accounts back in place for the SQL Server Service accounts (see above post on having to roll that back).   I'll let you know what happens.

  • I believe we have a solution. For those interested in the solution and also all the tests we did, here you go.

    Process Update: We got to a point where we could permanently convert the SQL Server Service Accounts to GMSA accounts. Then it was time to see what we could do with SSIS now that the Service Accounts were changed. Prior to doing the tests below, the same SSIS code, which was created in Visual Studio SSDT, was uploaded into SQL Server two ways:

    [1] with the old-style MSDB database (by importing the package into the Integration Services server in the Stored Packages folder), and also

    [2] into the new SSIS Catalog.

    To repeat the goal: my business need is to have t-sql code that I can put into a stored procedure which will in turn run the SSIS code/package/project. The stored procedure will be called by a button on the front end which a user runs. So, the user needs to be able to run the stored procedure from the front-end application. However, I recognize that lots of people get this error (about the SSIS job not being able to access a file server) and most people do not need a user to run the SSIS code. They just need the SSIS code to run in a *scheduled* Agent job. Hence, I did all the tests listed below to help this community get a sense of what happens with various scenarios beyond just what my agency needs.

    In the following tests, I did all the tests (except the last one) as myself/my account since:

    ....1) the tests were failing on me before we changed the SQL Server Service Accounts and

    ....2) if it doesn't work for me, it certainly isn't going to work for my user, and

    ....3) some of these options are helpful for future problem solving, and I wanted to know which options would be available in the future for my own developer needs.

    The big question is, can SSIS access our file server? I tried to answer that question with these tests running as myself:

    a) Object Explorer, MSDB/Stored Packages/MSDB (right-click and "Run Package") >>> FAILED with: "To run a SSIS package outside of SQL Server Data Tools you must install Standard Edition of Integration Services or higher."  (This is not the permissions problem, but it means that I can't do this test.)

    b) Object Explorer, Catalog (right-click on the package in SSMS in Object Explorer in the Catalog area, and choose "Execute") >>> FAILED (unable to access the file server; runs under my account; likely the double-hop problem)

    c) Direct Agent, MSDB (right click on Agent job which runs SSIS package stored on MSDB, and choose "Start job at step") >>> SUCCEED

    d) Direct Agent, Catalog (right click on Agent job which runs SSIS project stored on Catalog, and choose "Start job at step") >>> SUCCEED

    e) Scheduled Agent job, MSDB >>> SUCCEED

    f) Scheduled Agent job, Catalog >>> SUCCEED

    g) Direct t-sql with sp_start_job (run in SSMS in query window: EXEC msdb.dbo.sp_start_job N'MyJobName' ) >>> SUCCEED (Works when the Agent job is running either the MSDB or the Catalog version of the code--ie, this test was repeating c) and d) from above, but running the job from Microsoft's built-in stored procedure which runs Agent jobs.)

    h) Direct t-sql in SSMS query window which runs code** to run the Catalog project >>> FAILED (failed because it runs with my account and we have a double-hop problem)

    i) Stored proc containing t-sql which runs Catalog project (ie, this is test "h", but put inside a stored procedure) >>> FAILED (Actually, I thought this test succeeded the first time and ran under the GMSA account, but every subsequent time I have run it, it runs as my account and fails. I may have made a mistake on what I thought I was doing the first time.)

    j) Stored proc containing t-sql which runs sp_start_job (stored proc run in SSMS in query window - tried with both Catalog and MSDB jobs) >>> SUCCEED

    -------------------------------

    Using sp_start_job requires that the user have special permissions to run that stored procedure, so the easiest solution seemed to be to go with i), which avoids sp_start_job entirely. However, when I ran i) from a button on my front-end application, the Catalog job ran as myself and FAILED again. 🙁 So running the Catalog project directly via t-sql code does not appear to be an option for us.

    -------------------------------

    BOTTOM LINE FOR MY BUSINESS NEED: Now that the SQL Server Service Accounts have been changed to GMSA accounts which have the permissions to access our file server: I can run the Catalog project with t-sql, but only by going through a stored procedure containing sp_start_job, which starts the Agent job that runs the project. MSDB packages cannot be run directly through t-sql, but like the Catalog project, can be run successfully through t-sql and the built-in sp_start_job stored procedure.

    THE BIG FINALE: I set up the required permissions for my user to run sp_start_job.  I then set up the front-end application to run the stored procedure containing the sp_start_job code. It worked!  The user can run the job on-demand and the SSIS project accesses our file server without problem.

    NOTE: If someone else out there doesn't want to or can't change the SQL Server Service Accounts and using a proxy on an Agent job is an acceptable solution for your company, see an earlier post where I got the proxy on an Agent job to also work.


    Interesting side note: the MSDB scheduled Agent job takes 7-8 seconds. The Catalog job takes 12 seconds. A few seconds does not matter to me for this job; I'm just reporting this interesting difference. The Catalog version consistently takes longer.


    ** Nifty Tip: To generate the code which runs a Catalog project, you can right-click on the package in the SSMS Object Explorer. When you choose "Execute," you get a dialog box with a button near the top left labeled "Script". Click the little drop-down arrow on the Script button and you will see a couple of ways to ask SSMS to generate the correct t-sql code for you. 🙂

  • THANK YOU to everyone who helped!!

    Etiquette Question: Several people helped and provided pieces of this puzzle.   I summarized the answer in my previous post, but I don't know if I should mark that post as the answer.  Is it frowned upon to mark one's own post as the answer?  I would like credit to go where it is due.  I'm just not sure how to do that in this case.  Maybe no posts should be marked as the answer???

  • Firstly, great job at taking the bits and pieces from everyone and going through the paces to make sure you get to the right solution for your specific needs. I wish every OP had a similar approach.

    Don't worry too much about marking a specific post as the answer; the thread is public for anybody to see, and your feedback here is good enough.

  • One more thing you can try is to set the procedure to EXECUTE AS OWNER.  It seems the problem is user context - and maybe forcing the procedure to execute with the owner of the module will get the right context.

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs

  • re: "One more thing you can try is to set the procedure to EXECUTE AS OWNER"

    I had not heard of "EXECUTE AS OWNER" before and was intrigued.  So I researched it and then created a test stored procedure with the "EXECUTE AS OWNER" clause.  The stored procedure contained the t-sql to run the Catalog project directly without going through Agent; i.e., it was a tweak of test i) above.  The stored procedure contains code like this:

    DECLARE @execution_id bigint;

    -- create a Catalog execution for the package
    EXEC [SSISDB].[catalog].[create_execution]
        @package_name = N'MyPackageName.dtsx',
        @execution_id = @execution_id OUTPUT,
        @folder_name = N'MyFolderName',
        @project_name = N'MyProjectName',
        @use32bitruntime = False,
        @reference_id = NULL,
        @runinscaleout = False;

    SELECT @execution_id;

    -- set the logging level (1 = Basic)
    DECLARE @var0 smallint = 1;
    EXEC [SSISDB].[catalog].[set_execution_parameter_value]
        @execution_id,
        @object_type = 50,
        @parameter_name = N'LOGGING_LEVEL',
        @parameter_value = @var0;

    -- start the execution
    EXEC [SSISDB].[catalog].[start_execution] @execution_id;

    When I ran the test stored procedure, I got this error:

    "The server principal "sa" is not able to access the database "SSISDB" under the current security context."

    Background: The owner of the test stored procedure is "dbo" and "dbo" is mapped to the "sa" login.  And "sa" deliberately does not have permissions to do anything.

    I could probably change the owner of the stored procedure and play with this idea until something worked, but since I have a working solution now which is 'good enough,' I can't justify spending the time to do that.
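    (For anyone who does want to chase it: the pieces would probably look something like the following, with placeholder names. Note that EXECUTE AS impersonation is normally confined to the current database, so reaching into SSISDB from the proc also generally requires marking the calling database TRUSTWORTHY or signing the module; weigh the security trade-offs.)

    -- re-own the proc to a database user that maps to a login with SSISDB access
    ALTER AUTHORIZATION ON OBJECT::dbo.usp_TestCatalogRun TO [SsisLauncher];  -- hypothetical names

    -- allow the impersonated context to cross database boundaries
    ALTER DATABASE MyAppDb SET TRUSTWORTHY ON;  -- hypothetical database; a security trade-off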

    I appreciate you telling me about 'EXECUTE AS OWNER.'  That was a hole in my SQL Server education.
