June 20, 2011 at 3:40 pm
Hey guys,
I was wondering if you could point me to ideas/resources on this issue.
We had a database hosted internally; now it is hosted externally by another company that also (now) handles the web application end. An internal database still needs to query cached info from the (now) external database. As you might have guessed, those queries now take forever to run (minutes as opposed to seconds). I understand this could most likely be resolved by having our DB developer rewrite the queries, but time is an issue.
The information doesn't necessarily need to be real-time, one day off would work.
What tools/ideas would you suggest? I am a newbie and need to move quickly on this. I was thinking of scripting a backup/restore, possibly log shipping, etc.
Thanks!
June 20, 2011 at 3:43 pm
So maintaining local copies that are refreshed daily is an acceptable remedy? Great.
I'd use SSIS to do a truncate and bulk load of the tables of data. How much data are we talking about here?
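A minimal T-SQL sketch of that truncate-and-reload pattern (what the nightly SSIS package would effectively do), assuming a hypothetical linked server named REMOTESRV and a hypothetical local copy table dbo.Customers:

-- Nightly refresh of a local cached copy (all object names are assumptions).
-- Step 1: empty the local table (what an SSIS Execute SQL Task would run).
TRUNCATE TABLE dbo.Customers;

-- Step 2: bulk load it from the remote server (what the SSIS data flow would do).
-- OPENQUERY runs the SELECT on the remote side and returns only the result set.
INSERT INTO dbo.Customers (CustomerID, CustomerName, ModifiedDate)
SELECT CustomerID, CustomerName, ModifiedDate
FROM OPENQUERY(REMOTESRV,
    'SELECT CustomerID, CustomerName, ModifiedDate FROM dbo.Customers');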
CEWII
June 20, 2011 at 3:46 pm
The database is about 750 MB.
June 20, 2011 at 3:57 pm
And you need all the data in all the tables?
CEWII
June 20, 2011 at 4:12 pm
Good question, I'll look into which ones. Thanks!
June 21, 2011 at 4:59 am
It's less than 1 GB? Why not look into using SQL Azure? It sounds like a perfect solution.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
June 21, 2011 at 5:17 am
June 21, 2011 at 7:51 am
Jayanth, that does sound promising. The external vendor takes full backups daily so that might be a good option. Is there a way to automate this process?
June 21, 2011 at 8:51 am
The backups can be automated using a maintenance plan; for the FTP transfer you might want to use a batch file or an FTP utility, and the restore can be done with the T-SQL RESTORE command.
FYI, make sure the backups are compressed. I would suggest using incremental (differential) backups to avoid overhead on the network.
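A minimal T-SQL sketch of the two ends of that nightly cycle, assuming a hypothetical database name CachedDB, hypothetical file paths and logical file names, and a SQL Server edition that supports backup compression; the FTP copy in between would live in a batch file or an Agent job step:

-- On the external vendor's server: take a compressed full backup
-- (database name and paths are assumptions; COMPRESSION requires SQL Server 2008+).
BACKUP DATABASE CachedDB
TO DISK = N'D:\Backups\CachedDB.bak'
WITH COMPRESSION, INIT;

-- Locally, after the file has been pulled down via FTP: restore it,
-- overwriting yesterday's copy and relocating the files if needed
-- (logical names CachedDB / CachedDB_log are assumptions).
RESTORE DATABASE CachedDB
FROM DISK = N'E:\Incoming\CachedDB.bak'
WITH REPLACE,
     MOVE N'CachedDB' TO N'E:\Data\CachedDB.mdf',
     MOVE N'CachedDB_log' TO N'E:\Data\CachedDB_log.ldf';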
June 22, 2011 at 9:57 am
I don't think you mentioned the exact method currently used to pull data from the remote database; only that it is "queried". The advice below assumes you're just selecting from the remote database into a local table.
If you're currently using 4 part named queries with multiple joins against a linked server connection (like the example below), then don't.
insert into schemaname.tablename ( a, b, c)
select a, b, c
from linkedservername.dbname.schemaname.tablename1
join linkedservername.dbname.schemaname.tablename2 ... etc.
Instead try using the EXEC statement with pass-through style queries. The difference is that all query processing occurs on the remote server; you pass it a SQL statement, and it returns back a resultset.
insert into schemaname.tablename (a, b, c)
exec('
select a, b, c
from schemaname.tablename1
join schemaname.tablename2 ... etc.
') at MYLINKEDSERVER;
The above example would be a quick fix for making an existing distributed query perform better. However, perhaps the best method for transferring data between servers would be to rewrite the process using SSIS. Then again, the simple pass-through select / insert method mentioned above may perform better than SSIS when the source is a remote database.
"Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho