Viewing 15 posts - 2,521 through 2,535 (of 2,897 total)
I think the reason we needed a "push" subscription was that although the ports were open, our remote web server (publisher) would not allow requests from other servers; it had...
June 21, 2006 at 9:36 am
Those options refer to how the existing table and data at the subscriber are treated when the next snapshot runs. Depending on your requirements, you will favor one of those...
June 21, 2006 at 8:40 am
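Those options correspond to the article's "pre-creation command" that runs when the snapshot is applied; a hedged T-SQL sketch of setting it when scripting the article (publication and table names are only placeholders):

    -- @pre_creation_cmd controls what happens to the existing subscriber table:
    -- 'none' (leave it), 'delete' (delete rows matching the filter), 'drop', or 'truncate'
    EXEC sp_addarticle
        @publication      = N'MyPublication',
        @article          = N'MyTable',
        @source_object    = N'MyTable',
        @pre_creation_cmd = N'drop';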
Then your recovery model is probably set to FULL, meaning you are logging all those transactions. You can change it to SIMPLE for just that database, provided you do not...
June 21, 2006 at 7:32 am
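A minimal sketch of the recovery model change mentioned above, assuming the subscriber database is called MySubscriberDB (a placeholder name):

    -- Stop full transaction logging for just this database
    ALTER DATABASE MySubscriberDB SET RECOVERY SIMPLE;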
In Enterprise Manager, select Properties for the publication, then the Snapshot Location tab. Uncheck "Default location" and enter your preferred snapshot location in the box below.
June 20, 2006 at 2:41 pm
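If you prefer to script it instead of using Enterprise Manager, the same setting can be changed with sp_changepublication; a sketch run in the publication database (the publication name and UNC path are placeholders):

    -- Point the publication's snapshot at an alternate folder
    EXEC sp_changepublication
        @publication = N'MyPublication',
        @property    = N'alt_snapshot_folder',
        @value       = N'\\FileServer\ReplSnapshots';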
Claudia, did you determine what your recovery model is? (FULL, SIMPLE, or BULK_LOGGED.) Our subscription databases use SIMPLE because we don't need to log all the transactions that occur...
June 20, 2006 at 2:38 pm
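A quick way to check it, assuming the database is named MySubscriberDB (placeholder):

    -- Returns FULL, SIMPLE, or BULK_LOGGED
    SELECT DATABASEPROPERTYEX('MySubscriberDB', 'Recovery') AS RecoveryModel;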
Not sure if this applies ... but ...
When I replicated large amounts of data, I had to use a different Distribution Agent Profile. For example, I created a profile with:...
June 20, 2006 at 2:34 pm
Are the servers in different domains? If so, you might need to create a "push" subscription on the same server as the publisher instead of a "pull" subscription at...
June 20, 2006 at 2:25 pm
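A rough sketch of creating that push subscription in T-SQL, run in the publication database at the publisher (server, database, and publication names are placeholders):

    -- Register the subscriber with a push subscription
    EXEC sp_addsubscription
        @publication       = N'MyPublication',
        @subscriber        = N'SUBSCRIBERSERVER',
        @destination_db    = N'SubscriberDB',
        @subscription_type = N'push';

    -- Create the Distribution Agent job that runs at the publisher/distributor
    -- and pushes changes out to the subscriber
    EXEC sp_addpushsubscription_agent
        @publication   = N'MyPublication',
        @subscriber    = N'SUBSCRIBERSERVER',
        @subscriber_db = N'SubscriberDB';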
These are my thoughts on this ... let me know if I'm wrong:
Disabling DELETEs may give you unexpected results. For instance, if the primary key is UPDATED at the publisher,...
June 16, 2006 at 1:22 pm
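For context, DELETEs are usually disabled per article when the article is scripted; a hedged sketch (publication and table names are placeholders):

    -- Replicate INSERTs and UPDATEs for this article but not DELETEs
    EXEC sp_addarticle
        @publication   = N'MyPublication',
        @article       = N'MyTable',
        @source_object = N'MyTable',
        @del_cmd       = N'NONE';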
DBCC SHRINKFILE with the EMPTYFILE option sounds like what I want, but it's not working. I created additional files, both in the PRIMARY filegroup and SECONDARY. I ran...
June 8, 2006 at 2:11 pm
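For reference, the pattern being attempted looks like this (database and logical file names are placeholders). Note that EMPTYFILE only moves data into other files within the same filegroup:

    -- Migrate all data off the named file to the other files in its filegroup
    DBCC SHRINKFILE (N'MyDatabase_Data2', EMPTYFILE);

    -- Once empty, the file can be dropped from the database
    ALTER DATABASE MyDatabase REMOVE FILE MyDatabase_Data2;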
Another "gotcha" could be IF you reinitialize the subscription for some reason. Then when the data from the publisher is re-snapshotted and sent to the subscriber, it will wipe out the current...
May 26, 2006 at 12:58 pm
Sometimes the Job History doesn't give the full story. In addition, I like to write out a log by going into each job step, then the "Advanced" tab, and setting...
May 22, 2006 at 9:04 am
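The same output-file setting can also be scripted rather than set through the GUI; a sketch using sp_update_jobstep (job name, step id, and path are placeholders):

    -- Write this job step's output to a log file on disk
    EXEC msdb.dbo.sp_update_jobstep
        @job_name         = N'MyDistributionJob',
        @step_id          = 2,
        @output_file_name = N'C:\JobLogs\MyDistributionJob_step2.log';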
You might want to run SQL Server 2005 Upgrade Advisor against your 2000 database to find out what areas need fixing before it gets converted to 2005. I just downloaded...
May 17, 2006 at 9:46 am
I have mine set to 50,000 and 1,000. The default settings are too low for us. I doubt it affects performance much.
May 17, 2006 at 9:20 am
And do you have a full backup & uninterrupted transaction log backups from before the problem occurred?
May 17, 2006 at 9:07 am
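That combination is what makes a point-in-time restore possible; a minimal sketch (database name, file paths, and the STOPAT time are placeholders):

    -- Restore the last full backup, leaving the database ready for log restores
    RESTORE DATABASE MyDatabase
        FROM DISK = N'D:\Backups\MyDatabase_full.bak'
        WITH NORECOVERY;

    -- Apply log backups in sequence, stopping just before the problem occurred
    RESTORE LOG MyDatabase
        FROM DISK = N'D:\Backups\MyDatabase_log_01.trn'
        WITH STOPAT = '2006-05-17 08:00:00', RECOVERY;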
Sorry, I haven't used merge replication.
May 17, 2006 at 8:56 am