
  • subj: Using DTS to Generate and Email Excel Reports

    Very useful information Joe - thanks for sharing, this technique will save a lot of time for me.

     

  • This is very useful indeed.

    I have a question though. I need to change the Excel spreadsheet name every day (i.e. I want to dynamically incorporate a time stamp in its name [e.g. Report_05022007]). Is that possible?

     

    Thanks guys!

  • If you are using a transformation to fill your spreadsheet:

    Create an Excel template that you always use for the destination. (Just the field names on the worksheet.) 

    Upstream of the transform, create an ActiveX Script task. Write some VB code that builds the path on the fly based on the date, and assign the path to a local variable. Copy the template workbook to that path using the FileSystemObject. Then assign the local variable to a global string variable.

    After that, create a Dynamic Properties task and assign the global variable to the destination of the transformation.

    After that comes the transformation.

    If you have the Excel object model on the server, you can do your copy/move with that in the ActiveX script. But it's bloaty. Use the FSO.

    If you use an Excel template like this, you won't need to clear the 'table'.  Just leave the template empty, and dump your transform to the worksheet by name.  You may find that you need to cast some of your fields to varchar to make them look pretty in the output.

    Voila!
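    A minimal sketch of that ActiveX Script task, assuming a package global variable named gsDestPath and placeholder folder/template paths (adjust all of these to your environment):

    ```vb
    ' DTS ActiveX Script task (VBScript). The folder, template name and the
    ' global variable "gsDestPath" are placeholders, not from the article.
    Function Main()
        Dim fso, sPath

        ' Build the dated destination path, e.g. C:\Reports\Report_05022007.xls
        sPath = "C:\Reports\Report_" & _
                Right("0" & Month(Date), 2) & _
                Right("0" & Day(Date), 2) & _
                Year(Date) & ".xls"

        ' Copy the empty template workbook to that path with the FileSystemObject
        Set fso = CreateObject("Scripting.FileSystemObject")
        fso.CopyFile "C:\Reports\Template.xls", sPath, True

        ' Hand the path to the Dynamic Properties task via a global variable
        DTSGlobalVariables("gsDestPath").Value = sPath

        Main = DTSTaskExecResult_Success
    End Function
    ```

    The Dynamic Properties task then reads gsDestPath and pushes it into the data source of the Excel connection before the transform runs.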

    [font="Courier New"]ZenDada[/font]

  • Guys,

    If we do not have Microsoft Excel installed on the server where the template (created on my local machine, with Excel installed) resides, will this approach still work?

    Thanks a lot!

  • sql_er (7/21/2008)


    Guys,

    If we do not have the Microsoft Excel installed on the server where the template (created on my local machine with MS Excel installed) resides, will this approach still work?

    Thanks a lot!

    Heck yeah, just use the FileSystemObject.

    [font="Courier New"]ZenDada[/font]

  • Zendada,

    Thank you for the reply - I am still a bit confused though.

    My question was referring to the approach used by the original article. If I follow that approach as is, with no modifications, and the Excel template resides on a server with no MS Excel installed, will this approach work?

    Thanks a lot!

  • sql_er (7/21/2008)


    Zendada,

    Thank you for the reply - I am still a bit confused though.

    My question was referring to the approach used by the original article. If I follow that approach as is, with no modifications, and the excel template will reside on the server with no MS Excel installed, will this approach work?

    Thanks a lot!

    Oh sorry, I thought you were referring to my post. I just re-read the article. I don't see anything in it that requires the Excel object model on the server, so you don't need the Excel application installed. The DROP and CREATE TABLE statements don't require it, and neither does the SMTP code (instructions for that are in the article). Just make sure you use backticks and not apostrophes, exactly as in the article. Hope that helps.

    [font="Courier New"]ZenDada[/font]

  • Zendada,

    I don't think it is working. I tried it. Even though the DTS package executes with no errors, nothing happens - the original data remains in the excel spreadsheet.

    The puzzling part is that it executes successfully. I tried just having the DROP TABLE component to see if the contents of the Excel spreadsheet would be deleted ... although the package executed successfully, the contents were not deleted.

    Any suggestions?

    Thank you!

  • sql_er (7/22/2008)


    Zendada,

    I don't think it is working. I tried it. Even though the DTS package executes with no errors, nothing happens - the original data remains in the excel spreadsheet.

    The puzzling part is that it executes successfully. I tried just having the DROP TABLE component to see if the contents of the Excel spreadsheet would be deleted ... although the package executed successfully, the contents were not deleted.

    Any suggestions?

    Thank you!

    Hmm... I am not a DTS programmer any more, so I am working from memory here... let's see...

    Try these little debugging steps:

    Start with a fresh workbook and make sure it has only one sheet in it. Run the CREATE TABLE statement pointing at that one sheet. Close the connection. Go back and run the DROP TABLE statement.

    I think the connection object can point to either a sheet or a named range, and that the "table" is really a contiguous range. So play with that idea: if the CREATE TABLE didn't work before, try creating a named range now, and point the connection at that range instead of the sheet.

    As I recall, the default setting for Excel is not to close the connection, which leaves the workbook in a read-only state. So don't forget to change that setting.

    While you are messing around with this, between runs, try opening the workbook and deleting every single row (right-click, Delete - you know what I mean); that clears any remembered range. And if the create-range idea worked for you before, create your range again with only the column headings and no rows (or one row) in it.
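    For what it's worth, those CREATE/DROP experiments can also be run one at a time through ADO from an ActiveX Script, which makes it easy to test each statement in isolation (a sketch only; the workbook path, sheet name, and column names below are placeholders):

    ```vb
    ' Hedged sketch: exercising the worksheet "table" over a Jet connection.
    ' Path, sheet and column names are placeholders; note the backticks in
    ' the SQL, not apostrophes.
    Dim cn
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
            "Data Source=C:\Reports\Report.xls;" & _
            "Extended Properties=""Excel 8.0;HDR=Yes"""

    ' CREATE TABLE adds a worksheet (the "table") with a header row
    cn.Execute "CREATE TABLE `Sheet1` (`CustomerName` VarChar, `OrderTotal` VarChar)"

    ' DROP TABLE clears the data back out again
    cn.Execute "DROP TABLE `Sheet1`"

    cn.Close
    ```

    Closing the connection between statements matters here, for the read-only reason mentioned above.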

    [font="Courier New"]ZenDada[/font]

    Oh, and first thing - the data you have in there right now... how did it get there? Was that with your CREATE TABLE statement and insert? Or did you simply export into the sheet? If the latter, your connection is pointing at the sheet, not the table (range).

    [font="Courier New"]ZenDada[/font]

  • Zendada, thank you for all the suggestions.

    I think I figured it out.

    Here is what happened:

    I had that DTS package running on my local server. Everything ran fine, data was loaded into excel, and sent out properly.

    I was then asked to implement the same thing on the production server. I did. But when I manually ran the DTS package on the production server, for some reason, the data never changed there [which is why I made the post]. I then tried to delete the data, which also failed, as I described ... or did it?

    I looked back at my LOCAL folder and saw that the data was deleted there ... on my LOCAL server.

    I then went back to the production server, scheduled the 'delete table' DTS package as a job, and ran the job ... and voila ... the data got deleted in my production server's Excel spreadsheet.

    So, it seems that when I ran the PRODUCTION DTS manually (while being on my local machine), the DTS was looking at the Excel spreadsheet on my local drive [the locations on production and local are identical, which is why I did not catch this problem earlier].

    So, I'm assuming that if I were to log in to the production server and run the DTS from there, it would work too ... just like scheduling and running the job ...

  • Guys,

    I have noticed a few things happening with the excel spreadsheet, as I am using this approach:

    1. The Excel file keeps increasing in size, even though the amount of data inserted does not grow.

    2. If I inserted 1000 rows yesterday, and today I run the DTS package, which clears everything out and inserts 200 rows, I get 800 empty rows after them, and the file ends up the same size as the 1000-row version or even larger.

    3. Sometimes empty rows just keep getting added on every time I run this report.

    Has anyone encountered any of these problems and could point to a solution?

    Thanks a lot!

  • sql_er (7/25/2008)


    Guys,

    I have noticed a few things happening with the excel spreadsheet, as I am using this approach:

    1. The Excel file keeps increasing in size, even though the amount of data inserted does not grow.

    2. If I inserted 1000 rows yesterday, and today I run the DTS package, which clears everything out and inserts 200 rows, I get 800 empty rows after them, and the file ends up the same size as the 1000-row version or even larger.

    3. Sometimes empty rows just keep getting added on every time I run this report.

    Has anyone encountered any of these problems and could point to a solution?

    Thanks a lot!

    I recall that being one of the issues that led me to the approach I described earlier in this thread. I think the fix, though, was to manually name a range with the table name, with only the field names in the first row, spanning rows 1 and 2. Then use a transformation to fill the Excel table after the DROP statement. Something like that. Try my approach instead. It's much cooler.

    [font="Courier New"]ZenDada[/font]

    sql_er, this may explain your problem:

    When you save a workbook, Microsoft Office Excel stores only the part of each worksheet that contains data or formatting. Empty cells may contain formatting that causes the last cell in a row or column to fall outside of the range of cells that contains data. This causes the file size of the workbook to be larger than necessary and may result in more printed pages when you print the worksheet or workbook.

    To avoid these issues, you can locate the last cell that contains data or formatting on a worksheet, and then reset that last cell by clearing all of the formatting that may be applied in empty rows or columns between the data and the last cell.

    For example, enter any text or number in cell A10 and then enter a valid date in cell A20 of a new worksheet. Now delete the date in cell A20 and save: Excel still treats A20 as the last cell, because entering a date there caused Excel to automatically change the cell's format from "General" to a Date format.

    Also refer to:

    http://support.microsoft.com/?id=244435

    If everything seems to be going well, you have obviously overlooked something.

    Ron


  • Zendada, and Bitbucket - Thank you for the information.

    I used the following (as suggested) to fix the problem:

    Method 1: Manually delete excess rows and columns

    To reset the last cell by manually deleting excess rows and columns, follow these steps:

    1. Select all columns to the right of the last column that contains data by selecting the appropriate column headings.

    Tip: One way to do this is to press F5 and type the appropriate reference. For example, type F:IV for columns, or type 5:65536 for rows.

    2. On the Edit menu, click Clear All.

    Note: If this step does not clear all the excess formatting, you may have to right-click the columns and then click Delete to fully remove the cells. If you delete cells that are referenced by formulas, the reference changes to "#REF!". Therefore, we recommend that you create a backup of the original file and then check your formulas for "#REF!" after you follow these steps.

    3. Repeat steps 1 and 2 for the rows that are under the last row that contains data.

    4. Save the file.

    5. To continue working in the file, close and then reopen the file.

    It would be great if this approach could be automated though ...

    Thanks!
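    If Excel is installed on the machine doing the cleanup, the manual steps above could in principle be automated along these lines (a sketch only; the workbook path is a placeholder, and it assumes column A contains data down to the last real row):

    ```vb
    ' Hedged sketch: reset the "last cell" by deleting every row below the
    ' data, then save. Requires Excel on the machine running the script.
    Const xlUp = -4162
    Dim xl, wb, ws, lastRow

    Set xl = CreateObject("Excel.Application")
    xl.Visible = False
    xl.DisplayAlerts = False

    Set wb = xl.Workbooks.Open("C:\Reports\Report.xls")
    Set ws = wb.Worksheets(1)

    ' Find the last row with data in column A...
    lastRow = ws.Cells(ws.Rows.Count, 1).End(xlUp).Row

    ' ...and delete everything below it so Excel forgets the old used range
    ws.Rows(lastRow + 1 & ":" & ws.Rows.Count).Delete

    wb.Save
    wb.Close
    xl.Quit
    ```

    The same idea would also clear any leftover formatting in the excess rows, which is what inflates the file size.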
