I have a table on a Linked Server that I'm pulling data from, and I'd like to know if there is a more efficient way to do this given the table design. I need to do a full reload rather than use "t_stamp >= DATEDIFF_BIG(MS, '19691231 20:00:00', GETDATE()) - (1000 * 60 * 30)" in the WHERE clause.
I know the min and max t_stamp for the table, but that is such a large pull that I was wondering if I could break up the SELECT and grab, say, 50,000 records at a time between the min and max until there are no more records to process (see the sketch after the table definition below). The SELECT feeds an INSERT into a staging table for other processing.
Thanks, and if more info is needed, just ask.
SELECT DISTINCT
       ql.Line,
       ql.Tag_Description,
       sd.floatvalue AS 'CaptureValue',
       DATEADD(s, t_stamp / 1000, '1969-12-31 20:00:00') AS 'DateRecorded'
-- FROM / JOIN clauses not included in the post
WHERE (t_stamp >= DATEDIFF_BIG(MS, '19691231 20:00:00', GETDATE()) - (1000 * 60 * 30))
  AND ql.Tag_ID = sd.tagid
CREATE TABLE [dbo].[sqlt_data_1_2021_08](
[tagid] [int] NOT NULL,
[intvalue] [bigint] NULL,
[floatvalue] [float] NULL,
[stringvalue] [nvarchar](255) NULL,
[datevalue] [datetime] NULL,
[dataintegrity] [int] NULL,
[t_stamp] [bigint] NOT NULL,
PRIMARY KEY CLUSTERED
(
    [tagid] ASC,
    [t_stamp] ASC   -- key columns were cut from the post; (tagid, t_stamp) assumed from the two NOT NULL columns
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
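
What I have in mind is a keyset-paged loop that pulls 50,000-row batches and advances a watermark after each INSERT. Below is a rough sketch; [LinkedSrv], [SourceDb], dbo.TagLookup (the "ql" table), and dbo.Staging are placeholder names, and it assumes dbo.Staging is truncated before the load:

DECLARE @BatchSize int    = 50000;
DECLARE @LastStamp bigint;
DECLARE @LastTag   int    = 0;
DECLARE @Rows      int    = 1;

-- Start just below the known minimum so the first pass picks everything up.
SELECT @LastStamp = MIN(t_stamp) - 1
FROM [LinkedSrv].[SourceDb].[dbo].[sqlt_data_1_2021_08];

WHILE @Rows > 0
BEGIN
    -- TOP + ORDER BY pages through the rows in (t_stamp, tagid) order;
    -- carrying both key columns makes each row unique, so DISTINCT isn't needed here.
    INSERT INTO dbo.Staging (Line, Tag_Description, CaptureValue, DateRecorded, tagid, t_stamp)
    SELECT TOP (@BatchSize)
           ql.Line,
           ql.Tag_Description,
           sd.floatvalue,
           DATEADD(s, sd.t_stamp / 1000, '1969-12-31 20:00:00'),
           sd.tagid,
           sd.t_stamp
    FROM [LinkedSrv].[SourceDb].[dbo].[sqlt_data_1_2021_08] AS sd
    JOIN dbo.TagLookup AS ql
      ON ql.Tag_ID = sd.tagid
    WHERE sd.t_stamp > @LastStamp
       OR (sd.t_stamp = @LastStamp AND sd.tagid > @LastTag)
    ORDER BY sd.t_stamp, sd.tagid;

    SET @Rows = @@ROWCOUNT;

    -- Advance the watermark to the last key pair copied.
    IF @Rows > 0
        SELECT TOP (1) @LastStamp = t_stamp, @LastTag = tagid
        FROM dbo.Staging
        ORDER BY t_stamp DESC, tagid DESC;
END;

I also wonder whether pushing the range filter to the remote side with OPENQUERY would help, since a four-part-name join can drag more rows across the link than needed. Is something like this reasonable, or is there a better way?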