Capturing The Error Description In A Stored Procedure
Posted Tuesday, July 22, 2008 12:50 PM
SSC Veteran

Zendada, thank you for all the suggestions.

I think I figured it out.

Here is what happened:

I had the DTS package running on my local server. Everything ran fine: the data was loaded into Excel and sent out properly.

I was then asked to implement the same thing on the production server. I did. But when I manually ran the DTS package on the production server, for some reason the data never changed there [which is why I made the original post]. I then tried to delete the data, which also appeared to fail ... or did it?

I looked back at my LOCAL folder and saw that the data had been deleted there ... on my LOCAL server.

I then went back to the production server, scheduled the 'delete table' DTS package as a job, and ran the job ... and voila ... the data got deleted in the Excel spreadsheet on the production server.

So it seems that when I ran the PRODUCTION DTS package manually (while on my local machine), the DTS was looking at the Excel spreadsheet on my local drive [the file locations on production and local are identical, which is why I did not catch this earlier].

So I'm assuming that if I logged in to the production server and ran the DTS package from there, it would work too, just as scheduling and running it as a job does ...
Post #538768
Posted Friday, July 25, 2008 1:38 PM
SSC Veteran

Guys,

I have noticed a few things happening with the Excel spreadsheet while using this approach:

1. The Excel file keeps increasing in size, even though the data inserted does not grow in size (as if so
2. If I inserted 1,000 rows yesterday, and today I run the DTS package, which clears everything out and inserts 200 rows, I get 800 empty rows after them, and the spreadsheet is the same size as the 1,000-row one or even larger
3. Sometimes empty rows just keep getting added every time I run this report

Has anyone encountered any of these problems and could point me to a solution?

Thanks a lot!
Post #541267
Posted Friday, July 25, 2008 1:59 PM


SSC-Addicted

sql_er (7/25/2008)
Guys,

I have noticed a few things happening with the Excel spreadsheet while using this approach:

1. The Excel file keeps increasing in size, even though the data inserted does not grow in size (as if so
2. If I inserted 1,000 rows yesterday, and today I run the DTS package, which clears everything out and inserts 200 rows, I get 800 empty rows after them, and the spreadsheet is the same size as the 1,000-row one or even larger
3. Sometimes empty rows just keep getting added every time I run this report

Has anyone encountered any of these problems and could point me to a solution?

Thanks a lot!


I recall that being one of the issues that led me to the approach I described earlier in this thread. I think the fix, though, was to manually name the range with the table name, with only the field names in the first row and the named range covering rows 1 and 2, and then use a transformation to fill the Excel table after the delete statement. Something like that. Try my approach instead. It's much cooler.


ZenDada
Post #541273
Posted Friday, July 25, 2008 6:26 PM


SSCertifiable

sql_er, this may explain your problem:

When you save a workbook, Microsoft Office Excel stores only the part of each worksheet that contains data or formatting. Empty cells may contain formatting that causes the last cell in a row or column to fall outside of the range of cells that contains data. This causes the file size of the workbook to be larger than necessary and may result in more printed pages when you print the worksheet or workbook.

To avoid these issues, you can locate the last cell that contains data or formatting on a worksheet, and then reset that last cell by clearing all of the formatting that may be applied in empty rows or columns between the data and the last cell.

Let's say you enter any text or number in cell A10 and then enter a valid date in cell A20 of a new worksheet. Now delete the date in cell A20 and save: the last cell is still reported as A20, and the file remains larger than the data alone requires. This is because entering a date in A20 caused Excel to automatically change the cell's format from "General" to a date format, and that formatting survives the deletion of the value.

Also refer to:
http://support.microsoft.com/?id=244435
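The used-range behavior described above can be modeled abstractly. A minimal Python sketch (purely illustrative, no Excel involved; the sets are hypothetical (row, column) coordinates) showing how formatting alone extends the "last cell":

```python
def last_cell(values, formatted):
    """Return the highest (row, col) touched by either a value or formatting,
    mimicking how Excel's used range is extended by formatting alone."""
    touched = set(values) | set(formatted)
    if not touched:
        return None
    return (max(r for r, _ in touched), max(c for _, c in touched))

# Value in A10 (row 10, col 1); a date was entered and then deleted in A20,
# leaving only a date format behind in that cell.
values = {(10, 1)}
formatted = {(20, 1)}
print(last_cell(values, formatted))  # → (20, 1): formatting alone extends the range
```

Even though A20 holds no value, the "last cell" stays at row 20, which is exactly why the saved file is bigger than the data suggests.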


If everything seems to be going well, you have obviously overlooked something.

Ron

Post #541335
Posted Monday, July 28, 2008 2:00 PM
SSC Veteran

Zendada, and Bitbucket - Thank you for the information.

I used the following (as suggested) to fix the problem:

Method 1: Manually delete excess rows and columns
To reset the last cell by manually deleting excess rows and columns, follow these steps:
1. Select all columns to the right of the last column that contains data by selecting the appropriate column headings.

Tip: One way to do this is to press F5 and type the appropriate reference. For example, type F:IV for columns, or type 5:65536 for rows.
2. On the Edit menu, click Clear All.

Note: If this step does not clear all the excess formatting, you may have to right-click the columns and then click Delete to fully remove the cells. If you delete cells that are referenced by formulas, the reference changes to "#REF!". Therefore, we recommend that you create a backup of the original file and then check your formulas for "#REF!" after you follow these steps.
3. Repeat steps 1 and 2 for the rows that are under the last row that contains data.
4. Save the file.
5. To continue working in the file, close and then reopen the file.


It would be great if this approach could be automated though ...

Thanks!
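The manual clean-up steps above can be automated in spirit. A minimal Python sketch (hypothetical helper, not part of DTS) of the core idea: trim the trailing all-empty rows from a grid of cell values before writing it back out:

```python
def trim_trailing_empty_rows(rows):
    """Drop rows from the end whose cells are all empty strings or None,
    so leftover rows from a previous, larger export do not linger."""
    trimmed = list(rows)
    while trimmed and all(cell in (None, "") for cell in trimmed[-1]):
        trimmed.pop()
    return trimmed

grid = [
    ["id", "amount"],
    [1, 200],
    ["", ""],        # leftover rows from yesterday's 1,000-row export
    [None, None],
]
print(trim_trailing_empty_rows(grid))  # → [['id', 'amount'], [1, 200]]
```

Interior empty rows are deliberately kept; only the trailing block, which is what inflates the file, is removed.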
Post #542181
Posted Thursday, March 12, 2009 2:44 PM
Forum Newbie

So, I set it up the way originally laid out in this article, and then the bean counters wanted date-specific files. I found your post and it looks great, but I am a little lost. Currently I create a 'table' for the Excel file, and drop and re-create it daily. Your suggestion is to create the path in an ActiveX object (so I could pre-define most of the path and then just add in the date part?), but how do I then create the file from that template? This is all a bit new to me, so sorry if I seem obtuse; I'm just not sure how to bring this together. It sounds like exactly what I am looking for, just not quite sure how to implement it, so any further help you can give would be great.

Willie
ZenDada (5/2/2007)


If you are using a transformation to fill your spreadsheet:

Create an Excel template that you always use for the destination. (Just the field names on the worksheet.)

Upstream of the transform, create an ActiveX Script task. Write some VB code that creates the path on the fly based on the date, and assign the path to a local variable. Copy the template workbook to that path using the FileSystemObject. Assign the local variable to a global string variable.

After that, create a Dynamic Properties task. Assign the global variable to the destination of the transformation.

After that comes the transformation.

If you have the Excel object model on the server, you can do your copy/move with that in the ActiveX script, but it's bloated. Use the FSO.

If you use an Excel template like this, you won't need to clear the 'table'. Just leave the template empty, and dump your transform to the worksheet by name. You may find that you need to cast some of your fields to varchar to make them look pretty in the output.

Voila!
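The path-building step ZenDada describes can be illustrated outside DTS. A minimal Python sketch (not the VBScript ActiveX the post describes, and with hypothetical file names) of building the dated path and copying the empty template into place:

```python
import datetime
import os
import shutil

def dated_report_path(base_dir, run_date=None):
    """Build a path like <base_dir>/20090313_sales.xls for the given date."""
    run_date = run_date or datetime.date.today()
    return os.path.join(base_dir, run_date.strftime("%Y%m%d") + "_sales.xls")

def create_report_from_template(template_path, base_dir, run_date=None):
    """Copy the empty template workbook to the dated path; the data
    transformation then fills that copy, so no clearing is ever needed."""
    dest = dated_report_path(base_dir, run_date)
    shutil.copyfile(template_path, dest)
    return dest

print(dated_report_path("reports", datetime.date(2009, 3, 13)))
```

Because each run copies a fresh, empty template, the stale-rows and file-bloat problems discussed earlier in the thread never arise.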
Post #674731
Posted Friday, March 13, 2009 2:56 AM
SSC Veteran

Willie,

See if this helps - http://www.sqlservercentral.com/Forums/Topic93756-9-1.aspx



Post #674983
Posted Friday, March 13, 2009 10:09 AM
Forum Newbie

So, I can get the whole drop, re-create, and fill-the-table thing figured out, but that just uses the same table over and over again. The part I can't see how to do is name the file something like mmddyysales.xls, where (obviously) mmddyy (or yymmdd; it's the date, not the format, that I am after) is the day for which I am running the report. Thanks for the help!

Willie
Post #675369
Posted Wednesday, February 10, 2010 11:18 PM
Forum Newbie

Hi,

Very nice article. Thanks.
But what if I need to keep the headers bold and only delete/insert the data rows each time?

Is there any way to do that? Or is there any way to make the table headers bold after creating the table?

Thanks,
Varun/-
Post #863853
Posted Thursday, February 11, 2010 12:49 AM
SSC Veteran

There'll probably be a way to include the headers in SQL, but not to make them bold. For that I would suggest a script that does it automatically when the workbook is opened.
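A script along those lines could look like this sketch using openpyxl (a third-party Python library, shown purely as one post-hoc way to bold a header row; it is not part of DTS and assumes openpyxl is installed):

```python
from openpyxl import Workbook
from openpyxl.styles import Font

def bold_header(ws):
    """Apply a bold font to every cell in the first (header) row."""
    for cell in next(ws.iter_rows(min_row=1, max_row=1)):
        cell.font = Font(bold=True)

# Build a small workbook in memory and bold its header row.
wb = Workbook()
ws = wb.active
ws.append(["id", "amount"])
ws.append([1, 200])
bold_header(ws)
```

Run as a post-processing step (or from a workbook-open macro, as the reply suggests) after the export has filled in the data rows.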


Post #863882