Stored procedure to archive data in a table into a CSV file

  • I would like to write a stored procedure/SQL script that archives data in a table into a CSV file based on a WHERE condition (where the add_Datetime = today's date):

    Archive data in table 'x' into a '.csv' file (on a network drive) where Add_dateTime = current date

    Appreciate any help!

  • I like to use BCP:

    declare @cmd varchar(1000)

    set @cmd =

    'bcp "select Col1, Col2, Col3 from x where Add_dateTime = CONVERT(VARCHAR(10),getdate(),120)" queryout "F:\output.csv" -t, -T -c'

    exec xp_cmdshell @cmd

    Then wrap that in a proc.
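
    For example, a minimal sketch of that (the procedure name here is just an example; the table, columns, and output path are from the command above):

    CREATE PROCEDURE dbo.ArchiveTableX
    AS
    BEGIN
        DECLARE @cmd varchar(1000)
        SET @cmd = 'bcp "select Col1, Col2, Col3 from x where Add_dateTime = CONVERT(VARCHAR(10),getdate(),120)" queryout "F:\output.csv" -t, -T -c'
        EXEC xp_cmdshell @cmd
    END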

    _________________________________
    seth delconte
    http://sqlkeys.com

    Thanks for the response. I had to slightly modify the query to yield today's results, as follows:

    'bcp "select Col1, Col2, Col3 from x where DATEDIFF(day, add_datetime, GETDATE()) = 0" queryout "F:\output.csv" -t, -T -c'

    If I run the following command from the command line, it works fine.

    bcp "select Col1, Col2, Col3 from x where DATEDIFF(day, add_datetime, GETDATE()) = 0" queryout "F:\output.csv" -t, -T -c

    But if I declare variables (since I want to dynamically append the date to the output file name) and then execute it using 'exec', it fails with the error:

    SQLState = S1000, nativeError = 0

    Error = [Microsoft][SQL Native Client]Unable to open BCP host data-file

    I tried running it from SQL Server Management Studio and as a batch file; I see the same error.

    Would appreciate any inputs on what I am doing wrong and how to set it right!

  • Annee (3/12/2012)


    But if I declare variables (since I want to dynamically append the date to the output file name) and then execute it using 'exec', it fails with the error:

    SQLState = S1000, nativeError = 0

    Error = [Microsoft][SQL Native Client]Unable to open BCP host data-file

    Can I see your complete SQL script, including the variable declarations? It sounds like it's probably a quote-placement issue...
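
    In the meantime, a quick way to spot quoting problems is to PRINT the command rather than executing it, and eyeball exactly what would be handed to xp_cmdshell:

    PRINT @cmd
    -- and only once the printed line looks right:
    -- EXEC xp_cmdshell @cmd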

    _________________________________
    seth delconte
    http://sqlkeys.com

  • This is what I am trying to do, and I get the above-mentioned error:

    DECLARE @FileName varchar(50),

    @bcpCommand varchar(2000)

    SET @FileName = REPLACE('c:\\ArchiveDocumentHold_'+CONVERT(char(8),GETDATE(),1)+'.csv','/','-')

    SET @bcpCommand = 'bcp "select * from DLSData.dbo.DocumentHold where DATEDIFF(day, adddatetime, GETDATE()) = 0" queryout "'

    SET @bcpCommand = @bcpCommand + @FileName + '" -t, -T -c'

    EXEC xp_cmdshell @bcpCommand

    I also tried simplifying it as below, but I still see the same error:

    DECLARE @bcpCommand varchar(2000)

    SET @bcpCommand = 'bcp "select * from DGSData.dbo.DocumentHold where DATEDIFF(day, adddatetime, GETDATE()) = 0" queryout " C:\\ArchiveOutput.csv" -t, -T -c'

    EXEC xp_cmdshell @bcpCommand

  • Annee (3/12/2012)


    DECLARE @bcpCommand varchar(2000)

    SET @bcpCommand = 'bcp "select * from DGSData.dbo.DocumentHold where DATEDIFF(day, adddatetime, GETDATE()) = 0" queryout " C:\\ArchiveOutput.csv" -t, -T -c'

    EXEC xp_cmdshell @bcpCommand

    The problem I see with this statement is a space at the beginning of the path string: " C:\\ArchiveOutput.csv"
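
    i.e. it should read:

    SET @bcpCommand = 'bcp "select * from DGSData.dbo.DocumentHold where DATEDIFF(day, adddatetime, GETDATE()) = 0" queryout "C:\\ArchiveOutput.csv" -t, -T -c'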

    _________________________________
    seth delconte
    http://sqlkeys.com

  • I just tried to replicate the issue using your first query (modified to use the AdventureWorks DB), and it output 19614 records to a .csv:

    DECLARE @FileName varchar(50),

    @bcpCommand varchar(2000)

    SET @FileName = REPLACE('c:\\ArchiveDocumentHold_'+CONVERT(char(8),GETDATE(),1)+'.csv','/','-')

    SET @bcpCommand = 'bcp "select * from AdventureWorks.Person.Address where DATEDIFF(day, modifieddate, GETDATE()) > 0" queryout "'

    SET @bcpCommand = @bcpCommand + @FileName + '" -t, -T -c'

    EXEC xp_cmdshell @bcpCommand

    output

    ----------------------------------------------------------------

    NULL

    Starting copy...

    1000 rows successfully bulk-copied to host-file. Total received: 1000

    1000 rows successfully bulk-copied to host-file. Total received: 2000

    1000 rows successfully bulk-copied to host-file. Total received: 3000

    1000 rows successfully bulk-copied to host-file. Total received: 4000

    1000 rows successfully bulk-copied to host-file. Total received: 5000

    1000 rows successfully bulk-copied to host-file. Total received: 6000

    1000 rows successfully bulk-copied to host-file. Total received: 7000

    1000 rows successfully bulk-copied to host-file. Total received: 8000

    1000 rows successfully bulk-copied to host-file. Total received: 9000

    1000 rows successfully bulk-copied to host-file. Total received: 10000

    1000 rows successfully bulk-copied to host-file. Total received: 11000

    1000 rows successfully bulk-copied to host-file. Total received: 12000

    1000 rows successfully bulk-copied to host-file. Total received: 13000

    1000 rows successfully bulk-copied to host-file. Total received: 14000

    1000 rows successfully bulk-copied to host-file. Total received: 15000

    1000 rows successfully bulk-copied to host-file. Total received: 16000

    1000 rows successfully bulk-copied to host-file. Total received: 17000

    1000 rows successfully bulk-copied to host-file. Total received: 18000

    1000 rows successfully bulk-copied to host-file. Total received: 19000

    NULL

    19614 rows copied.

    Network packet size (bytes): 4096

    Clock Time (ms.) Total : 93 Average : (210903.22 rows per sec.)

    NULL

    (26 row(s) affected)

    _________________________________
    seth delconte
    http://sqlkeys.com

  • That's good to know!

    Have a question: where exactly did you run all these statements from?

    Management Studio?

  • Annee (3/12/2012)


    That's good to know!

    Have a question: where exactly did you run all these statements from?

    Management Studio?

    Yes, you may have to enable the xp_cmdshell extended sproc first:

    USE master
    GO
    EXEC sp_configure 'show advanced options', 1
    GO
    RECONFIGURE WITH OVERRIDE
    GO
    EXEC sp_configure 'xp_cmdshell', 1
    GO
    RECONFIGURE WITH OVERRIDE
    GO
    EXEC sp_configure 'show advanced options', 0
    GO
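
    You can confirm the setting took effect with:

    SELECT name, value_in_use FROM sys.configurations WHERE name = 'xp_cmdshell'
    -- value_in_use = 1 means xp_cmdshell is enabled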

    _________________________________
    seth delconte
    http://sqlkeys.com

  • I already configured it using the 'SQL Server Surface Area Configuration' GUI.

    But to make sure, I repeated it using the commands you sent.

    But no luck... still the same error.

    Under the master db -> stored procs, should I see 'sys.xp_cmdshell'? I don't find it there, so I just wanted to clarify.

  • Annee (3/12/2012)


    I already configured it using the 'SQL Server Surface Area Configuration' GUI.

    But to make sure, I repeated it using the commands you sent.

    But no luck... still the same error.

    Under the master db -> stored procs, should I see 'sys.xp_cmdshell'? I don't find it there, so I just wanted to clarify.

    You'd be getting an explicit xp_cmdshell error if that was a problem. So you tried removing the leading whitespace in your path and still the same error?
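
    (To the side question: xp_cmdshell is an extended stored procedure, so it won't appear under the regular stored procedures node. If you want to double-check that it exists, something like this will show it:)

    SELECT name, type_desc FROM master.sys.all_objects WHERE name = 'xp_cmdshell'
    -- expect one row with type_desc = EXTENDED_STORED_PROCEDURE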

    _________________________________
    seth delconte
    http://sqlkeys.com

  • Yes! I removed the white space in the file path.

    But still see the same error.

  • Use a UNC path instead, and make sure the SQL service account has full control on the shared folder.
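
    For example (the server and share names here are just placeholders):

    SET @bcpCommand = 'bcp "select * from DGSData.dbo.DocumentHold where DATEDIFF(day, adddatetime, GETDATE()) = 0" queryout "\\YourServer\YourShare\ArchiveOutput.csv" -t, -T -c'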

    __________________________
    Too much of anything is unhealthy...

  • Thanks a lot, that did the magic! Gave the SQL service account full permission on the folder and it worked perfectly!

    Seth, appreciate your time and inputs; it definitely helped pinpoint the problem!

  • When I write data into a local drive, I can successfully create the output data file as

    'ArchiveDocumentHold_03-13-12.csv'

    When I write to the network drive, I am able to successfully create the output data file, but for some reason the name of the file is truncated as below (of course it still contains all the data):

    'ArchiveDocumentH'

    If I shorten the name to 'ADH_03-13-12.csv', the file name is copied successfully, but I don't understand why it has a problem with long file names on the network drive.

    (Of course I can manually create a long file name on that network drive, but my bcp command is unable to.) Appreciate any inputs! Thanks!
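
    (One thing I still need to rule out, in case it matters: @FileName is declared varchar(50), so the UNC path plus the longer file name may simply not fit and get cut off before bcp ever sees it. A wider declaration would take that out of the picture; the server/share names below are placeholders:)

    DECLARE @FileName varchar(260)   -- enough room for a full UNC path plus file name
    SET @FileName = REPLACE('\\YourServer\YourShare\ArchiveDocumentHold_'+CONVERT(char(8),GETDATE(),1)+'.csv','/','-')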
