Viewing 15 posts - 1,501 through 1,515 (of 7,614 total)
Don't drop the existing table.
Create a new, test (dummy) Quality table UNIQUEly CLUSTERED on ( Quality_Date, Quality_ID ).
CREATE TABLE dbo.Quality_Test ( ... )
CREATE UNIQUE CLUSTERED INDEX QUALITY_Test__CL ON dbo.Quality_Test (...
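A fuller sketch of that test table, purely for illustration -- the column list here is hypothetical (the real columns would come from the existing Quality table's DDL); only the clustering keys ( Quality_Date, Quality_ID ) come from the post:

```sql
-- Sketch only: column names/types are placeholders, not the real Quality definition.
CREATE TABLE dbo.Quality_Test (
    Quality_ID    int           NOT NULL,
    Quality_Date  datetime      NOT NULL,
    Quality_Value decimal(9, 2) NULL
);

-- Cluster uniquely on ( Quality_Date, Quality_ID ) so date-range queries
-- can seek into the clustered index instead of scanning the whole table.
CREATE UNIQUE CLUSTERED INDEX QUALITY_Test__CL
    ON dbo.Quality_Test ( Quality_Date, Quality_ID );
```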
June 18, 2021 at 2:51 pm
And again -- and for the last time, so as not to upset everyone -- you likely need to re-cluster the Quality table to get even more performance gains, and...
June 17, 2021 at 9:09 pm
What almost always gives the best payback is to first review all indexes on the table, particularly the clustered index. Until you have the best clus...
June 17, 2021 at 9:07 pm
Why? If what you're saying is true, then just use the first char in the value as the category, that's less overhead than any kind of table lookup.
Because...
June 17, 2021 at 9:02 pm
According to your sample data and results, the "Answer" you marked is not correct. For example, you stated that '10000' should return a 1 -- "Anything between 00000...
June 17, 2021 at 6:51 pm
Hopefully this should improve things. As I noted before, it's very likely that re-clustering the Quality table would give you even more performance gains, as it would prevent a full...
June 17, 2021 at 6:48 pm
In the first query, you need to specify a specific db to look at, or include the db name in the results. The object_ids and index_ids are unique only within...
June 17, 2021 at 3:09 pm
According to your sample data and results, the "Answer" you marked is not correct. For example, you stated that '10000' should return a 1 -- "Anything between 00000 to 10000...
June 17, 2021 at 2:53 pm
We need to see the DDL for the two tables, including all index definitions.
Btw, if the tables really were "properly indexed", you wouldn't be having big performance issues on this...
June 16, 2021 at 4:21 pm
For the pure SQL part, technically this could theoretically perform better since it avoids using functions on the table columns:
WHERE (p.JCCo=@Company and p.Job=@Job AND Phase NOT LIKE...
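The point about avoiding functions on table columns (keeping the predicate sargable) can be sketched like this -- table/column names are assumptions, not from the original query:

```sql
-- Non-sargable: wrapping the column in a function blocks an index seek
WHERE LEFT(p.Phase, 2) <> '01'

-- Sargable rewrite: leave the column bare so an index on Phase can be used
WHERE p.Phase NOT LIKE '01%'
```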
June 16, 2021 at 3:00 pm
So I'm looking for the most efficient way to pull the last 30 minutes of data from the table. To convert it to a date/time, I want to use DATEADD(s,t_stamp/1000,'1969-12-31...
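One way to sketch that conversion and filter together (dbo.SensorLog is a hypothetical table name standing in for the real one):

```sql
-- t_stamp is assumed to be Unix-epoch milliseconds (UTC).
SELECT DATEADD(SECOND, t_stamp / 1000, '19700101') AS event_time_utc
FROM dbo.SensorLog  -- hypothetical table name
WHERE t_stamp >= DATEDIFF_BIG(MS, '19700101', DATEADD(MINUTE, -30, GETUTCDATE()));
```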
June 16, 2021 at 4:15 am
Found that I had to use '1969-12-31 20:00:00' to get the correct datetime.
Be careful with that! If you need UTC and "kludge" it this way instead, it may...
June 16, 2021 at 4:14 am
To get a UTC calc, just substitute GETUTCDATE() for GETDATE() in my code.
June 15, 2021 at 8:16 pm
Hmm, looks like UTC time.
Or something in the data -- i.e. it's not really ms since 1/1/70 -- or it's local time where the server is?!
June 15, 2021 at 8:04 pm
I put no max time, so, yes, it would give you everything from then on.
If you want to limit to the then-current time, do this:
WHERE t_stamp >= DATEDIFF_BIG(MS, '19700101', GETDATE())...
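Putting the pieces from this thread together in one sketch (dbo.SensorLog is an assumed table name): compute the epoch-millisecond cutoff once, on the constant side, so no function ever touches the t_stamp column, and substitute GETUTCDATE() for GETDATE() if the timestamps are UTC:

```sql
-- Cutoff = 30 minutes ago, expressed as Unix-epoch milliseconds.
-- Keeping the function on the constant side leaves t_stamp sargable.
DECLARE @cutoff_ms bigint =
    DATEDIFF_BIG(MS, '19700101', DATEADD(MINUTE, -30, GETUTCDATE()));

SELECT DATEADD(SECOND, t_stamp / 1000, '19700101') AS event_time_utc
FROM dbo.SensorLog           -- hypothetical table name
WHERE t_stamp >= @cutoff_ms;
```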
June 15, 2021 at 7:52 pm