Bulk Insert using UNC path of a Filetable Directory

  • Overall goal: Write a BULK INSERT statement using the UNC path of a FileTable directory.

    Issue: When using the UNC path of the FileTable directory in a BULK INSERT statement, I receive "Operating system error code 50 (The request is not supported.)". Looking for confirmation as to whether this is truly unsupported, or whether anyone has seen this issue and resolved it.

    Environment: SQL Server 2012 Standard, Windows Server 2008 R2 Standard.

  • Is the FileTable directory on the same server or on a different one?

    Erland Sommarskog, SQL Server MVP, www.sommarskog.se

  • Same server.

  • And same database?

    Erland Sommarskog, SQL Server MVP, www.sommarskog.se

  • Same database. Running T-SQL statement directly from SSMS. Run as Windows Login and as 'sa'. Same result.

  • Can you post the code you are using? And if you are using Dynamic SQL, please post the generated SQL.


    And then again, I might be wrong ...
    David Webb

  • -----Create Schema------

    CREATE SCHEMA Master_Data;

    ------Create Table------

    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    SET ANSI_PADDING ON
    GO
    CREATE TABLE [Master_Data].[File_Load](
        [Load_ID] [bigint] IDENTITY(1,1) NOT NULL,
        [File_Record] [varchar](max) NOT NULL,
        [IsError] [bit] NOT NULL,
        CONSTRAINT [PK_MMR_File_Load] PRIMARY KEY CLUSTERED
        (
            [Load_ID] ASC
        ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    ) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
    GO
    SET ANSI_PADDING OFF
    GO
    ALTER TABLE [Master_Data].[File_Load] ADD CONSTRAINT [DF_File_Load_IsError] DEFAULT ((0)) FOR [IsError]
    GO

    -----Create view for handling of identity field------

    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    CREATE VIEW [dbo].[vw_File_Load]
    AS
    SELECT File_Record
    FROM Master_Data.File_Load
    GO

    -----Sample file data for TestFile_20130731.txt-------

    Claus|Santa
    Fairy|Tooth
    Bunny|Easter

    -----Bulk Insert script-------

    BULK INSERT vw_File_Load
    FROM '\\Vebasqlmmrdev01\trust\File_Table\FileTableTb\TestFile_20130731.txt'
    WITH
    (
        ROWTERMINATOR = '0x0a'
    )

    NOTE: If I map the UNC path on the OS side and use the mapped drive in the BULK INSERT statement, it works when using a Windows login in SSMS. I then run into issues when trying to schedule it in a stored procedure using dynamic SQL, mapped drives, etc. That is why I am trying to keep the question as simple as possible: is it possible to use the UNC path of a FileTable in a BULK INSERT statement?

  • I am able to reproduce the issue, and I note that it works if I attempt to bulk load the file from another SQL Server instance.

    But what are you trying to achieve? It appears that you want to load the entire contents of a file into a table. In that case, wouldn't it be easier to do:

    SELECT list_to_table(convert(varchar(MAX), file_stream), char(10))
    FROM yourfiletable

    Where list_to_table is a function that cracks a string into rows. See this article on my web site for a heap of such functions: http://www.sommarskog.se/arrays-in-sql-2005.html.

    Erland Sommarskog, SQL Server MVP, www.sommarskog.se

    Thank you Erland. I will take a look at the approach you recommended. My concern, as you mention in your article, is performance. Some of the files I load are quite large. I had tried something similar before but found the performance lacking for the large files.

    I am currently developing a process that loads each file record into a single field (which gets the file into SQL without any fuss).

    Then, based on a user-defined set of layouts (configured through a web UI I developed), it parses that field into multiple fields in another table. A poor man's ETL, so to speak.

    Obviously I can write SSIS packages to do the same thing, but the method I have developed is:

    1. More dynamic and requires less knowledge from the end user perspective.

    2. Faster than SSIS based on what I have read and experienced.

    3. More portable, having all my code in SP's instead of SSIS packages.

    SQL Server 2012's FileTable functionality lets me drop files into the UNC directory from an FTP process, then use T-SQL to loop through the files in the FileTable and kick off a BULK INSERT for each one, without publishing a single SSIS package.

    I ran into issues when trying to use the UNC path.

    I have a workaround in place at the moment, but using the UNC path would be much cleaner.
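    For reference, the file-looping step described above can be sketched roughly as follows. This is only an illustration: the FileTable name dbo.FileTableTb and the TestFile_ name pattern are taken from earlier posts, and FileTableRootPath() / GetFileNamespacePath() are the SQL Server 2012 functions for composing a file's UNC path. Note that, per this thread, BULK INSERT against that path fails with OS error 50 on the local instance, so the generated path would need to go through a mapped drive or be run from a remote instance.

    ```sql
    -- Illustration only: loop over the files in the FileTable and run a
    -- BULK INSERT per file. FileTableRootPath() returns the FileTable's
    -- share root; GetFileNamespacePath() returns the file's relative path.
    DECLARE @path nvarchar(4000), @sql nvarchar(max);

    DECLARE file_cur CURSOR LOCAL FAST_FORWARD FOR
        SELECT FileTableRootPath(N'dbo.FileTableTb') +
               file_stream.GetFileNamespacePath()
        FROM   dbo.FileTableTb
        WHERE  is_directory = 0
          AND  name LIKE 'TestFile_%.txt';

    OPEN file_cur;
    FETCH NEXT FROM file_cur INTO @path;

    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @sql = N'BULK INSERT vw_File_Load FROM ' +
                   quotename(@path, '''') +
                   N' WITH (ROWTERMINATOR = ''0x0a'')';
        EXEC sp_executesql @sql;
        FETCH NEXT FROM file_cur INTO @path;
    END;

    CLOSE file_cur;
    DEALLOCATE file_cur;
    ```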

    I think that you are correct in that BULK INSERT is more conservative with memory than a list-to-table function.

    As for the setup as such, I guess SSIS fans cringe, but since I have never gotten around to learning SSIS myself, I am sympathetic to the idea.

    I can't say why you get the error, but I assume that it is some internal restriction.

    Erland Sommarskog, SQL Server MVP, www.sommarskog.se

  • Erland,

    I decided to try your approach and in the preliminary tests the results are better than I expected.

    A simplified version of my script is as follows:

    --Beginning of sample script---

    DECLARE @File_Content varchar(MAX), @Delimiter varchar(10)

    SET @Delimiter = CHAR(10)

    SELECT @File_Content = (convert(varchar(MAX), file_stream))
    FROM [dbo].[FileTableTb]

    INSERT INTO vw_File_Load
    SELECT file_record
    FROM [dbo].[udf_text_list_to_tbl](@File_Content, @Delimiter)

    --End of sample script---

    -----Function (modified slightly from your website)-------

    CREATE FUNCTION udf_text_list_to_tbl (@list varchar(MAX), @Delimiter varchar(10))
    RETURNS @tbl TABLE (file_record varchar(MAX) NOT NULL) AS
    BEGIN
        DECLARE @pos      int,
                @nextpos  int,
                @valuelen int

        SELECT @pos = 0, @nextpos = 1

        WHILE @nextpos > 0
        BEGIN
            SELECT @nextpos = charindex(@Delimiter, @list, @pos + 1)
            SELECT @valuelen = CASE WHEN @nextpos > 0
                                    THEN @nextpos
                                    ELSE len(@list) + 1
                               END - @pos - 1
            INSERT @tbl (file_record)
                VALUES (substring(@list, @pos + 1, @valuelen))
            SELECT @pos = @nextpos
        END

        RETURN
    END

    --End of function-----

    Numbers for comparison:

    File size: 406,243 KB
    Records: 838,695

    Timings:

    BULK INSERT: 1 minute 9 seconds
    Sample script: 1 minute 33 seconds

    In conclusion, your method simplifies the security model for the application and is cleaner than my workaround. While it is a little slower, I am going to give it a try.

    Thanks for your assistance.

    Louis

  • That's great to hear. Note that the article suggests other alternatives that are faster than the simple version you tried.
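    (For instance, a set-based splitter over a numbers table, one of the families of techniques the article covers, looks roughly like this. dbo.Numbers is an assumed auxiliary table with a single integer column n = 1, 2, 3, ..., at least as long as the longest file:)

    ```sql
    -- Sketch of a set-based split on CHAR(10) using a numbers table,
    -- instead of the WHILE loop in udf_text_list_to_tbl. A row starts at
    -- every position n that immediately follows a delimiter (or the start
    -- of the string), and runs to the next delimiter.
    DECLARE @list varchar(max) =
        'Claus|Santa' + char(10) + 'Fairy|Tooth' + char(10) + 'Bunny|Easter';

    SELECT substring(@list, n,
                     charindex(char(10), @list + char(10), n) - n) AS file_record
    FROM   dbo.Numbers
    WHERE  n <= len(@list) + 1
      AND  substring(char(10) + @list, n, 1) = char(10);
    -- Returns the rows 'Claus|Santa', 'Fairy|Tooth', 'Bunny|Easter'
    ```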

    Erland Sommarskog, SQL Server MVP, www.sommarskog.se

  • Louis, did you ever achieve a resolution for this? I have bumped into the same issue, and the only workaround is to move the files to an alternate location or to use BCP.

  • FileTables don't allow local memory-mapped file access, which may explain the "request is not supported" error with BULK INSERT. I would expect a technique that performs the bulk insert remotely (e.g. BCP, SSIS, etc.) to avoid this restriction with FileTables.
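    A minimal sketch of that remote-load idea, using the server and share names from earlier in the thread (the database name MyDatabase is a placeholder), run from a machine other than the SQL Server host so the FileTable share is accessed over SMB rather than locally:

    ```shell
    # Hypothetical bcp workaround: load the FileTable file from a remote
    # client, using a trusted connection (-T), character mode (-c), and a
    # newline row terminator (-r). The view has a single column, so each
    # whole line (including the '|' characters) lands in File_Record.
    bcp MyDatabase.dbo.vw_File_Load in "\\Vebasqlmmrdev01\trust\File_Table\FileTableTb\TestFile_20130731.txt" -S Vebasqlmmrdev01 -T -c -r "\n"
    ```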
