I had this problem recently when using OPENQUERY against a linked Oracle 10g server from SQL Server 2005. I got this error:
Msg 7347, Level 16, State 1
OLE DB provider 'OraOLEDB.Oracle' for linked server '####' returned data that does not match expected data length for column '[OraOLEDB.Oracle].'####''. The (maximum) expected data length is 64, while the returned data length is 16.
I used the To_NChar() conversion function inside the OPENQUERY statement (on the Oracle side of things), wrapping each column that reported a length limitation. Note that To_Char() does not work, which is what I tried first.
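A minimal sketch of what that looks like. The linked server name (`ORA_LINK`) and the table and column names (`MY_TABLE`, `MY_COL`) are placeholders, not from my actual setup:

```sql
-- To_NChar() on the Oracle side returns Nvarchar2, which the provider
-- maps back to SQL Server without tripping the length check.
SELECT *
FROM OPENQUERY(ORA_LINK,
    'SELECT To_NChar(MY_COL) AS MY_COL FROM MY_TABLE');
```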
Distributed queries attempt to match data types between the two providers, but there are too many type combinations for the official documentation to be much help. I found this page of mappings more useful.
Most of the time when I am using OpenQuery it is against an OLAP cube, and the data returned is Ntext (I can't find anything in BOL about that, so I can't verify that conjecture). When I need to insert those values into a SQL Server Varchar field, I have to explicitly cast them as Varchar on the SQL Server side of things.
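For example, something along these lines (the linked server name `OLAP_LINK`, the target table, and the column names are all hypothetical):

```sql
-- The Ntext column coming back from the cube must be cast explicitly
-- before it can go into a Varchar column on the SQL Server side.
INSERT INTO dbo.Results (Caption)
SELECT CAST(Caption AS VARCHAR(100))
FROM OPENQUERY(OLAP_LINK, 'SELECT Caption FROM SomeCubeView');
```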
That didn't work in this situation with Oracle.
I took a stab in the dark: if I could get the column into the Oracle type that maps to Ntext, I could do what I wanted at the SQL Server end. According to that last link, NCLOB is the equivalent of Ntext, so the Oracle function To_NCLOB() works, as does To_NChar() (which returns the Oracle datatype Nvarchar2). I think NChar is preferable to the NCLOB conversion for some superstitious reason (smaller datatypes are cooler?).
Wacky error anyhow. I put hardcoded strings in place of the columns and tried various lengths, and found it is actually expecting exactly 64 bytes (32 nchars), despite the error saying it's a maximum length. Oracle confuses me... I love SQL Server!
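The probe itself was nothing fancy, roughly this shape (again, `ORA_LINK` is a placeholder; `RPAD` and `dual` are standard Oracle). Varying the pad length is what showed that only a 32-nchar (64-byte) value satisfied the provider:

```sql
-- Substitute a padded literal for the real column and vary the length.
-- Only a value of exactly 32 national characters (64 bytes) went through.
SELECT *
FROM OPENQUERY(ORA_LINK,
    'SELECT To_NChar(RPAD(''X'', 32, ''X'')) AS probe FROM dual');
```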