Insert Into Oracle FROM Sql Server over Linked Server

  • Does anyone know how to insert into an Oracle table with timestamp columns from a SQL Server table with datetime columns? Here are the table defs (I kept the table name and column names the same across both platforms):

    ORACLE:

    create table Table_Name(

    Col1 varchar2(9),

    Col2 timestamp,

    Col3 varchar2(1000),

    Col4 varchar2(40),

    Col5 timestamp,

    constraint pk_col1 unique (Col1) using index tablespace TS_INDEX_01

    )

    tablespace TS_DATA_01;

    SQL SERVER:

    CREATE TABLE [dbo].[Table_Name] (

    [Col1] [varchar] (9) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,

    [Col2] [datetime] NULL ,

    [Col3] [varchar] (1000) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,

    [Col4] [varchar] (40) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,

    [Col5] [datetime] NULL

    ) ON [PRIMARY]

    GO

    Here is what I have tried:

    Tested syntax and this select works:

    select * from openquery(linkedserver, 'select Col1,TO_CHAR(Col2, ''YYYY-MM-DD HH24:MI:SSXFF3''),Col3,Col4,TO_CHAR(Col5, ''YYYY-MM-DD HH24:MI:SSXFF3'') from Table_Name')

    So, I tried this query (two queries at the same time; the syntax works with tables that have no timestamp/datetime fields):

    INSERT into openquery(linkedserver, 'select Col1,TO_CHAR(Col2, ''YYYY-MM-DD HH24:MI:SSXFF3''),Col3,Col4,TO_CHAR(Col5, ''YYYY-MM-DD HH24:MI:SSXFF3'') from Table_Name') -- Oracle

    SELECT Col1,Col2,Col3,Col4,Col5 from Table_Name -- SQL Server

    And I get this error (I can insert into the tables from SQL*Plus and QA, so neither table is read-only):

    Server: Msg 7399, Level 16, State 1, Line 1

    OLE DB provider 'OraOLEDB.Oracle' reported an error.

    [OLE/DB provider returned message: ROW-00008: Cannot update data in a read-only column]

    OLE DB error trace [OLE/DB Provider 'OraOLEDB.Oracle' IRowsetChange::InsertRow returned 0x80004005: ].

    Any assistance would be greatly appreciated!!!

    Thanks,

    Allen

  • I'm not sure whether Oracle supports inserting explicit values into a timestamp column, but SQL Server doesn't allow it.

    Try inserting a record into the Oracle table with an explicit timestamp value.

    --Ramesh


  • In SQL*Plus I can insert a record with timestamp fields as follows:

    insert into Table_Name values

    ('014867514',

    TIMESTAMP'2003-04-03 08:37:43.593',

    'ADAPTER,CONNECTOR,FIBER OPTIC',

    'A00000039.jpg',

    TIMESTAMP'2007-11-27 11:20:10.867' );

    So, I am thinking there has to be some intermediary step between selecting from SQL Server and inserting into Oracle. My best guess is that SQL Server doesn't recognize the Oracle timestamp data type, so it interprets it as something else. If that is the case, then using CONVERT might work, but I have no idea which SQL Server data type to CONVERT to.

    Anything is possible: it's just a matter of whether you're smart enough to figure it out. This one is testing me!

  • The SELECT inside the OPENQUERY INSERT/SELECT can only reference column names; if you use SELECT TO_CHAR(...), that column is a function result, not a table column that can be written to. Function-value columns will be read-only no matter what permissions you have on the table. But that's not your only problem.

    SQL Server timestamp columns are not datetime values; they are just integers. (Actually they are BINARY(8), but you can convert them to bigints.) Every record that is inserted or updated in a table with a timestamp column gets the next sequential value. It's like an IDENTITY field, but there is only one sequence per database. There is no way to turn this back into a datetime: you can compare two records and know which one is more recent, but you don't know when either change happened.
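    A minimal sketch of this behavior (the demo table and column names here are invented for illustration):

    ```sql
    -- SQL Server's timestamp (a.k.a. rowversion) column is an auto-generated
    -- BINARY(8) row version, not a point in time; you cannot insert into it.
    CREATE TABLE dbo.RowVersionDemo (
        Id int IDENTITY(1,1) PRIMARY KEY,
        Payload varchar(40),
        RowVer timestamp
    );

    INSERT INTO dbo.RowVersionDemo (Payload) VALUES ('first');
    INSERT INTO dbo.RowVersionDemo (Payload) VALUES ('second');

    -- The values are sequential database-wide, so you can tell which row
    -- changed more recently, but you can never recover a wall-clock time:
    SELECT Id, Payload, CAST(RowVer AS bigint) AS RowVerAsInt
    FROM dbo.RowVersionDemo
    ORDER BY RowVer;
    ```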

    If you need an actual date-and-time value that can be exported to Oracle, you probably need an INSTEAD OF INSERT, UPDATE trigger that sets a datetime field to GETDATE() (or CURRENT_TIMESTAMP if you want to be less SQL Server-centric).
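    A rough sketch of such a trigger, using the thread's dbo.Table_Name and treating Col2 as the "last modified" datetime (the trigger name is invented):

    ```sql
    CREATE TRIGGER trg_Table_Name_SetCol2
    ON dbo.Table_Name
    INSTEAD OF INSERT
    AS
    BEGIN
        -- Perform the insert ourselves, overriding any supplied Col2
        -- with the current server time.
        INSERT INTO dbo.Table_Name (Col1, Col2, Col3, Col4, Col5)
        SELECT Col1, GETDATE(), Col3, Col4, Col5
        FROM inserted;
    END
    ```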

  • Ahhhh... the light goes on! That would certainly explain the error messages I get!

    Okay, I'm thinking I can definitely get Oracle to provide the timestamp for the time the record was actually inserted into the Oracle table; that will meet the user needs. But the other datetime field was the time the record was originally entered into the system: the SQL Server table I am working from is created from a select on another table.

    Do you know if there is a way to convert that datetime value into another data type (varchar?) in the SQL Server select statement I use to populate my table? Then I would just need to go from varchar (for example) to the Oracle timestamp field. Or, to reverse that logic, can I go from datetime to, say, varchar2 in Oracle (and then have a trigger that fires a block of PL/SQL code to cast the value as a timestamp after it arrives)?

    Just trying to play some games with the DBMSs to get them to do what we need. I'll give the above a try, but if you already have experience with this and know I'm fighting a losing battle, let me know!

    Thanks, Scott!!!

  • You can convert the datetime to a varchar in YYYY-MM-DD HH:MM:SS.SSS format with CONVERT(varchar, datetime_col, 121).

    The SQL Server field could work like an Oracle timestamp if you populate the field with an INSTEAD OF trigger to make sure it is current.

    If you transfer data from Oracle to SQL Server and want to maintain the original values, you could use ALTER TABLE to disable the timestamp trigger and then reenable it after the transfer.
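    A sketch of both suggestions together (the trigger name here is an assumption; substitute whatever your datetime-setting trigger is called):

    ```sql
    -- Style 121 produces 'yyyy-mm-dd hh:mi:ss.mmm', which Oracle's
    -- TO_TIMESTAMP can parse with the mask 'YYYY-MM-DD HH24:MI:SS.FF3'.
    SELECT Col1,
           CONVERT(varchar(23), Col2, 121) AS Col2_Str,
           Col3,
           Col4,
           CONVERT(varchar(23), Col5, 121) AS Col5_Str
    FROM dbo.Table_Name;

    -- To preserve the original values during a bulk transfer, disable the
    -- trigger first and re-enable it afterwards:
    ALTER TABLE dbo.Table_Name DISABLE TRIGGER trg_Table_Name_SetCol2;
    -- ... run the transfer ...
    ALTER TABLE dbo.Table_Name ENABLE TRIGGER trg_Table_Name_SetCol2;
    ```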

  • Scott,

    Thanks for all the help!!! I ended up going from the original SQL Server table to a SQL Server stage table, converting the datetime to varchar as part of the move.

    Next, I went from the SQL Server stage table to an Oracle stage table via the linked server, receiving the now-varchar'ed datetime into a varchar2 column. On insert into the Oracle stage table, a trigger fires that converts the varchar2 datetime data into an Oracle timestamp and inserts it into the final Oracle table.

    It's a bit convoluted, but it works, and until the Oracle ODBC driver is updated (if ever) to handle the conversion, I don't think there is another way to work around this.
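    The Oracle side of that pipeline could look roughly like this (the stage table name, trigger name, and format mask are assumptions; the thread does not show the actual code):

    ```sql
    -- Fires on each insert into the stage table and moves the row to the
    -- final table, converting the varchar2 datetime text to a timestamp.
    CREATE OR REPLACE TRIGGER trg_stage_to_final
    AFTER INSERT ON Table_Name_Stage
    FOR EACH ROW
    BEGIN
        INSERT INTO Table_Name (Col1, Col2, Col3, Col4, Col5)
        VALUES (:NEW.Col1,
                TO_TIMESTAMP(:NEW.Col2, 'YYYY-MM-DD HH24:MI:SS.FF3'),
                :NEW.Col3,
                :NEW.Col4,
                TO_TIMESTAMP(:NEW.Col5, 'YYYY-MM-DD HH24:MI:SS.FF3'));
    END;
    /
    ```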

    I'll probably write it all up and post at this site since I could find no other documentation for this particular problem on the web.

    Thanks again!

    Allen

  • So what did your final insert into through the linkedserver look like?

    Mark

    The syntax of the insert is as follows:

    insert into openquery(linkedservername, 'select Col1,Col2,Col3,col4 from Oracle_Table')

    select Col1,Col2,Col3,Col4 from SQL_Server_Table;

    The column names are the same in both of my tables.

    Now I'm working on truncating the Oracle tables from the SQL Server linked server prior to the insert, and also on getting the whole thing scheduled through DTS. So far, DTS isn't liking the insert: I tried via the GUI and the Wizard, and when I build the details for the Oracle side of the connection, Enterprise Manager just shuts down.

    As far as the truncate goes, it appears that I am going to have to call an Oracle PL/SQL package (that does the truncate) from OPENQUERY. Of course, SQL Server expects a return value (not that it appears to do anything with it; it just expects one), and I'm no PL/SQL programmer, so this has been quite the learning experience.
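    One possible alternative, if you are on SQL Server 2005 or later: a pass-through PL/SQL block can be sent with EXECUTE ... AT instead of OPENQUERY, which sidesteps the need for a result set. The linked server must have the "rpc out" option enabled, and the package name below is invented for illustration:

    ```sql
    EXEC sp_serveroption 'linkedserver', 'rpc out', 'true';

    -- Run an anonymous PL/SQL block on the Oracle side
    EXECUTE ('BEGIN my_pkg.truncate_tables; END;') AT linkedserver;
    ```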

  • I'm managing to run a DTS transfer from SQL Server 2005 to Oracle 10g.

    But I'm getting a type error, even though the types match, when moving data from Oracle to MS SQL:

    "Error converting data type varchar to numeric"

    Any ideas? I believe it's connected to the first SQL in the OPENQUERY syntax.

  • allen.zobian - Thursday, December 13, 2007 10:09 AM

    until the Oracle ODBC driver is updated (if ever) 

    Still not solved to date! As you said... if ever.
