Importing data from Oracle to SQL Server takes 6 hours

  • There is a daily scheduled job which brings around 500,000 records from an Oracle server into SQL Server.

    The job takes about 6 hours to complete. Initially it ran as a package (set up using the Import/Export Wizard in SQL Server).

    I tried using a linked server, but the query still takes around the same time.

    The same query runs on the Oracle server in less than 30 seconds, and no other data conversion is done before inserting into the SQL Server tables.

    There are no indexes or constraints on the table in SQL Server.

    Incidentally, the two servers are located in two different regions.

    When I ping the local server the response time is 1 ms; for the Oracle server it is 145 ms. Could network connectivity be a reason?

    Given all this, what is the best method for importing the data from the Oracle server?

    Is using bcp (bcp to a .txt file) a good option? Is this utility available in Oracle? Please help... :unsure: ... I'm at a crossroads.

  • Network connectivity may be the problem, but it is not always the cause.

    When you transfer data from SQL Server to Oracle or vice versa, each value has to be converted from its source type to a suitable type for the destination database. That creates more overhead than, for example, exporting data from SQL Server to Access.

    To reduce the network burden, you can use Access as an in-between database: first export the data from Oracle to Access on the Oracle server, then import the data from that Access database (.mdb file) into SQL Server.

    Also, do you have a BLOB data type in any of your tables? That also adds load when importing/exporting from Oracle to SQL Server.

    "Don't limit your challenges, challenge your limits"

  • How is the destination in your data flow defined? Do the last couple of words in it read something like 'Fast Load'? If not, could you go to the destination and screen-print all the available tabs?

    Also, I am REALLY against using Access as an intermediary.

    CEWII

  • How long does an FTP transfer of a comparable size take between the two servers?

    _____________________________________
    Pablo (Paul) Berzukov

    Author of Understanding Database Administration available at Amazon and other bookstores.

    Disclaimer: Advice is provided to the best of my knowledge, but no implicit or explicit warranties are provided. Since the advisor explicitly encourages testing any and all suggestions in a non-production test environment, the advisor should not be held liable or responsible for any actions taken based on the given advice.
  • Perhaps someone who knows Oracle well could advise the best way to 'bulk export' the results of a query to a file?

    (Last time I checked, this wasn't easy without third party tools, but I hope the world has moved on since then)

    If you could do that, it would separate the process nicely:

    1. Export from Oracle to a file

    2. FTP the file to the SQL Server site

    3. Use one of the many fast and free ways to bulk load SQL Server (bcp, BULK INSERT, OPENROWSET(BULK), SSIS...) - see the sketch after this list

    ...making it easy to see which part is slow, and also giving you two 'checkpoints' to restart from - minimizing the amount of work that has to be redone in the event of a failure.
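
    As an illustration of step 3, here is a minimal BULK INSERT sketch - the file path and table name are made up, and it assumes a comma-delimited file with no header row:

    -- load the transferred file into an empty staging table
    BULK INSERT dbo.StagingTable
    FROM 'C:\transfer\MyFile.csv'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\n',
        TABLOCK             -- table lock allows a minimally logged load
    );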

    Paul

  • Provided FTP performs well and the issue is not related to CNS(*), Paul's idea would do the trick. Here is the Oracle-side magic to get a CSV file...

    log into sqlplus...

    rem trimspool stops each line being padded to linesize; feedback off keeps 'n rows selected' out of the file
    set colsep ',' head off linesize 300 pagesize 0 trimspool on feedback off

    spool /tmp/MyFile.csv

    select whatever_columns from whatever_tables where whatever_condition;

    spool off

    (*) CNS = Crappy Network Syndrome

    _____________________________________
    Pablo (Paul) Berzukov

  • PaulB,

    :w00t: SQLPlus? ColSep? :w00t:

    Wow, that brought back some very old memories.

    Is there a way to automate that process these days?

    PaulW

  • Paul White (9/9/2009)


    Is there a way to automate that process these days?

    Oh yes, you can automate it.

    1... pack the SQL code above into a .sql file

    2... write a shell script that sets the environment variables, logs into sqlplus, and executes the SQL script

    3... use crontab to schedule the job as needed.
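
    Something like this, as a rough sketch - the paths, TNS alias, and credentials are all made up:

    #!/bin/sh
    # export_and_ftp.sh - hypothetical wrapper around the spool script above
    export ORACLE_HOME=/u01/app/oracle/product/11.2.0
    export PATH=$ORACLE_HOME/bin:$PATH

    # export.sql holds the set/spool/select/spool off commands shown earlier
    sqlplus -s scott/tiger@MYDB @/home/oracle/export.sql

    # then push /tmp/MyFile.csv to the SQL Server box with ftp, scp, etc.

    # crontab entry to run it daily at 2am:
    # 0 2 * * * /home/oracle/export_and_ftp.sh >> /tmp/export.log 2>&1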

    ... oops, are you also hearing those voices from the past? 😀

    _____________________________________
    Pablo (Paul) Berzukov

  • PaulB (9/9/2009)


    Oh yes, you can automate it.

    1... pack the SQL code above into a .sql file

    2... write a shell script that sets the environment variables, logs into sqlplus, and executes the SQL script

    3... use crontab to schedule the job as needed.

    ... oops, are you also hearing those voices from the past? 😀

    Awesome. It's good to hear them again!

    Crontab. Well well well 😎

  • Paul White (9/9/2009)


    Crontab. Well well well 😎

    okay, okay ... you can go with the dbms_scheduler system package if you want, but it will take ten times longer to deploy.
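
    For the record, a minimal dbms_scheduler sketch - the job name, script path, and schedule below are made up:

    BEGIN
      DBMS_SCHEDULER.CREATE_JOB(
        job_name        => 'EXPORT_CSV_JOB',
        job_type        => 'EXECUTABLE',
        job_action      => '/home/oracle/export_and_ftp.sh',  -- hypothetical script
        repeat_interval => 'FREQ=DAILY;BYHOUR=2',
        enabled         => TRUE);
    END;
    /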

    _____________________________________
    Pablo (Paul) Berzukov

  • No, no - crontab = awesomeness 🙂

  • Paul White (9/9/2009)


    No, no - crontab = awesomeness 🙂

    as a friend of mine likes to say, "everything that works ends up being based on a text file" 😀

    _____________________________________
    Pablo (Paul) Berzukov

  • Seems file creation/FTP is the old-school way of doing things (but probably still the fastest). From a SQL Server standpoint:

    1. What recovery model is your SQL Server database using? Switch it to bulk-logged for the data transfer.

    2. As stated before, are there disparate data types between the two systems? If so, use two statements: one to bring all the 'easy' data over, then one more to get the rest. That way you'll be able to see whether conversion is the culprit.

    3. Are you using the OPENQUERY statement? (A sketch of items 1 and 3 follows below.)
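
    A rough sketch of items 1 and 3 on the SQL Server side - the database, linked server, and table names are all made up:

    -- item 1: switch to bulk-logged recovery for the duration of the load
    ALTER DATABASE MyStagingDB SET RECOVERY BULK_LOGGED;

    -- item 3: pull rows through the linked server with OPENQUERY;
    -- the query in quotes runs on Oracle, so any filtering happens there
    INSERT INTO dbo.StagingTable WITH (TABLOCK)
    SELECT *
    FROM OPENQUERY(ORA_LINK, 'SELECT col1, col2 FROM remote_schema.remote_table');

    -- switch back once the load is done
    ALTER DATABASE MyStagingDB SET RECOVERY FULL;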

    ----------------------------------------------------------------------------
    Sacramento SQL Server users group - http://sac.sqlpass.org
    Follow me on Twitter - @SQLDCH
    ----------------------------------------------------------------------------

    Yeah, well...The Dude abides.
