June 6, 2018 at 11:15 pm
Please help. I tried to check this with the DTA, and it says there is a syntax error. I could not find what the error was.
USE [REGWMS]
GO
/****** Object: View [wh7].[vw_dw_HW_Outbound_report] Script Date: 6/6/2018 9:33:16 AM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
--SELECT * FROM [wh7].[vw_dw_HW_Outbound_report]
CREATE VIEW [wh7].[vw_dw_HW_Outbound_report]
AS
SELECT
    LEFT(pd.STORERKEY, 15) AS StorerKey,
    CONVERT(VARCHAR(16), o.actualshipdate, 120) AS ShipmentDate,
    CONVERT(VARCHAR(16), o.deliverydate, 120) AS RequestArriveDate,
    LEFT(o.externorderkey, 60) AS AppNo,
    LEFT(o.susr2, 30) AS SiteID,
    LEFT(od.susr5, 30) AS ProjectName,
    LEFT(l.lottable10, 50) AS ContractNo,
    LEFT(l.lottable07, 50) AS CLNo,
    LEFT(l.lottable08, 50) AS CaseNo,
    LEFT(pd.SKU, 50) AS Item,
    LEFT(pd.ID, 18) AS BoxName,
    LEFT(s.descr, 45) AS Descr,
    CASE WHEN s.icdflag = 1 THEN 'Y' ELSE 'N' END AS icdflag,
    --SUM(pd.Qty) AS Qty,
    SUM(CASE WHEN t.oother1 IS NULL THEN pd.QTY ELSE 1 END) AS QTY,
    LEFT(od.packkey, 50) AS UOM,
    LEFT(t.oother1, 30) AS SerialNo,
    LEFT(o.susr3, 30) AS ReceiverName,
    LEFT(o.susr4, 30) AS ReceiverTelNo,
    --o.c_Address1 + ' ' + o.c_Address2 + ' ' + o.c_Address3 + ' ' + o.c_city + ' ' + c_country AS Receiver,
    LEFT(o.c_company, 30) AS RECEIVER,
    LEFT(o.susr2, 30) AS CARGOTRACKINGNO,
    LEFT(od.susr1, 30) AS PickType,
    CONVERT(VARCHAR(16), o.adddate, 120) AS DBAddDate,
    CAST(CONVERT(VARCHAR, o.actualshipdate, 103) AS DATETIME) AS billingrecdate,
    LEFT(pd.DROPID, 30) AS DROPID,
    pd.pickdate,
    MAX(o.EDITDATE) AS OrderClosedDate
FROM pickdetail AS pd WITH (NOLOCK)
    INNER JOIN orderdetail AS od WITH (NOLOCK)
        ON pd.orderkey = od.orderkey AND pd.orderlinenumber = od.orderlinenumber
    INNER JOIN orders AS o
        ON o.orderkey = pd.orderkey
    INNER JOIN LOTATTRIBUTE AS l WITH (NOLOCK)
        ON pd.LOT = l.LOT AND pd.storerkey = l.storerkey AND pd.sku = l.sku
    INNER JOIN RECEIPT AS r
        ON r.receiptkey = l.lottable02 AND r.storerkey = l.storerkey
    INNER JOIN SKU AS s
        ON l.STORERKEY = s.STORERKEY AND l.SKU = s.SKU
    LEFT JOIN (
        SELECT pd.pickdetailkey, pd.orderkey, pd.orderlinenumber, pd.sku, pd.ID, pd.Lot,
               ld.serialkey, MAX(ld.oother1) AS oother1
        FROM pickdetail pd WITH (NOLOCK)
            INNER JOIN lotxidheader lh
                ON lh.ioflag = 'o' AND pd.storerkey = lh.storerkey
               AND pd.orderkey = lh.sourcekey AND pd.orderlinenumber = lh.sourcelinenumber
               AND pd.sku = lh.sku AND pd.lot = lh.lot AND pd.ID = lh.ID
               AND pd.pickdetailkey = lh.pickdetailkey
            INNER JOIN lotxiddetail ld WITH (NOLOCK)
                ON ld.lotxidkey = lh.lotxidkey
        WHERE pd.qty > 0
          AND pd.status > 5
          --AND pd.orderkey = '0000007533'
        GROUP BY pd.pickdetailkey, pd.orderkey, pd.orderlinenumber, pd.sku, pd.ID, pd.Lot, ld.serialkey
    ) t
        ON t.pickdetailkey = pd.pickdetailkey AND t.orderkey = pd.orderkey AND t.orderlinenumber = pd.orderlinenumber
WHERE pd.qty > 0
  AND pd.status > 5
  AND o.actualshipdate > GETDATE() - 120
GROUP BY
    pd.STORERKEY,
    o.actualshipdate,
    o.deliverydate,
    o.externorderkey,
    o.susr2,
    od.susr5,
    l.lottable10,
    l.lottable07,
    l.lottable08,
    pd.SKU,
    pd.ID,
    s.descr,
    s.icdflag,
    od.packkey,
    o.susr3,
    o.susr4,
    o.c_company,
    o.susr2,
    t.oother1,
    od.susr1,
    o.adddate,
    LEFT(pd.DROPID, 30),
    pd.pickdate
GO
I'm using this query, and it takes 30 minutes to get the results:
SELECT StorerKey, ShipmentDate, RequestArriveDate, AppNo, SiteID, ProjectName, ContractNo, CLNo, CaseNo, Item, BoxName, Descr, QTY, UOM, SerialNo,
ReceiverName, ReceiverTelNo, RECEIVER, CARGOTRACKINGNO,PickType,DBAddDate,icdflag,
'20180302' as FromDate, '20180303' as ToDate
FROM [wh7].[vw_dw_HW_Outbound_report]
where
STORERKEY ='HW'
and cast( billingrecdate as datetime) between '20180302' and '20180303'
June 6, 2018 at 11:22 pm
I'm also getting this error when executing the query:
Msg 242, Level 16, State 3, Line 1
The conversion of a varchar data type to a datetime data type resulted in an out-of-range value.
June 7, 2018 at 1:55 am
So what is your problem here: the syntax error, the conversion error, or the performance? Do you have any control over the code in that view? If so, start with this line: cast( convert(varchar, o.actualshipdate,103) as datetime) billingrecdate. Why are you converting to varchar and then to datetime? What is the data type of the actualshipdate column in the orders table? And ditch the NOLOCK hints, unless you're comfortable with getting incorrect results from time to time.
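For example, if actualshipdate is already a datetime column (an assumption; check the table definition), the round trip through varchar is unnecessary, and style 103 (dd/mm/yyyy) is the likely source of the Msg 242 error, since parsing it back depends on the session's language/DATEFORMAT settings:

```sql
-- Original: converts datetime -> dd/mm/yyyy string -> datetime again.
-- Parsing the dd/mm/yyyy string is only valid under DATEFORMAT dmy,
-- which is the probable cause of the out-of-range error (Msg 242):
--   CAST(CONVERT(VARCHAR, o.actualshipdate, 103) AS DATETIME) AS billingrecdate

-- Deterministic alternative that just strips the time portion:
CAST(o.actualshipdate AS DATE) AS billingrecdate

-- Or, if downstream code needs a DATETIME at midnight:
CAST(CAST(o.actualshipdate AS DATE) AS DATETIME) AS billingrecdate
```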
John
June 7, 2018 at 5:13 am
Why all the group by statements when you're not doing any aggregation that I can see? That's going to blow performance out of the water. Especially grouping on, what, 14 columns. Indexing that should be fun (not). Ah, no, I see, one SUM operation... OK, suggestion, do the SUM in a derived table, grouping by the bare minimum so that you can actually have an index that works, then join that back out to the greater set of columns. No amount of NOLOCK hints is going to speed up what are doubtless nothing but table scans in your execution plans.
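A rough sketch of that pattern, using the key columns from the posted view (illustrative only, not a drop-in replacement):

```sql
-- Step 1: do the SUM in a derived table, grouping by the bare minimum key,
-- so a narrow index on pickdetail can actually serve the aggregation.
;WITH agg AS (
    SELECT pd.orderkey, pd.orderlinenumber, SUM(pd.qty) AS TotalQty
    FROM pickdetail AS pd
    WHERE pd.qty > 0 AND pd.status > 5
    GROUP BY pd.orderkey, pd.orderlinenumber
)
-- Step 2: join the pre-aggregated rows back out to the wide, descriptive
-- columns. No 14-column GROUP BY is needed any more.
SELECT o.externorderkey, od.packkey, agg.TotalQty
FROM agg
    INNER JOIN orderdetail AS od
        ON od.orderkey = agg.orderkey AND od.orderlinenumber = agg.orderlinenumber
    INNER JOIN orders AS o
        ON o.orderkey = agg.orderkey;
```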
In addition to what John says, this bit of code is doing you no favors either:
cast( billingrecdate as datetime) between '20180302' and '20180303'
Fix the columns so that dates are stored as dates, or, change your code so that you deal with those strings as strings. Casting the column to another data type as part of filtering criteria will prevent statistics use, leading to scans (again, and I suspect that's all you're getting anyway).
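For example, assuming billingrecdate can be exposed as a real datetime (or left alone as one), the filter can be written so the column is untouched:

```sql
-- Non-sargable: the CAST on the column blocks index and statistics use.
WHERE CAST(billingrecdate AS DATETIME) BETWEEN '20180302' AND '20180303'

-- Sargable: compare the bare column against constants. A half-open range
-- also sidesteps the end-of-day boundary problem that BETWEEN has with
-- datetime values carrying a time portion.
WHERE billingrecdate >= '20180302'
  AND billingrecdate <  '20180304'
```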
Take a look at your execution plan to see how the optimizer is dealing with your T-SQL as written.
And finally, the DTA is evil. Don't use it.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
June 7, 2018 at 7:37 am
Only one comment on this one, "RE-DESIGN"
😎
There are too many anti-patterns and bad practices in the code to even start commenting on it. Things you should be aware of: the multi-column GROUP BY will put heavy pressure on tempdb, the implicit data type conversions are not going to help, and you can most likely speed this up by using temp tables if you don't have the option or capacity to redesign.
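As a rough sketch of the temp-table approach, using column names from the posted view (untested against your schema, so treat it as a starting point only):

```sql
-- Stage the filtered pick rows once, instead of scanning pickdetail in both
-- the outer query and the lotxid subquery.
SELECT pd.pickdetailkey, pd.orderkey, pd.orderlinenumber, pd.storerkey,
       pd.sku, pd.ID, pd.Lot, pd.qty
INTO #pick
FROM pickdetail AS pd
WHERE pd.qty > 0 AND pd.status > 5;

-- Index the temp table to support the subsequent joins.
CREATE CLUSTERED INDEX cx_pick ON #pick (orderkey, orderlinenumber);

-- Later steps then join #pick to orders, orderdetail, LOTATTRIBUTE, etc.,
-- each step producing a small intermediate result.
```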
June 8, 2018 at 1:54 pm
Thanks for your suggestion. I am new to the company; this view has been there for a long time and was working before. I am not good at query tuning, so I will try to learn more about this. I have no capacity for redesigning, so I will just try to create temp tables. Any more suggestions? 🙂 Can you suggest books on best practices for writing queries?
June 11, 2018 at 5:50 am
jccalmerin - Friday, June 8, 2018 1:54 PM
Thanks for your suggestion. I am new to the company and it has been there for a long time and it is working before. I am not good at query tuning. I will try to learn more about this. I have no capacity of redesigning so i wll just try to create temp tables. More suggestions. 🙂 can you suggest books to read best practices when creating query statements?
I would normally recommend my books available in the links below. However, both books will get a new version within a very few weeks, so I'm not sure I want to push them on you too much.
June 12, 2018 at 11:07 am
Grant Fritchey - Monday, June 11, 2018 5:50 AM
jccalmerin - Friday, June 8, 2018 1:54 PM
Thanks for your suggestion. I am new to the company and it has been there for a long time and it is working before. I am not good at query tuning. I will try to learn more about this. I have no capacity of redesigning so i wll just try to create temp tables. More suggestions. 🙂 can you suggest books to read best practices when creating query statements?
I would normally recommend my books available in the links below. However, both books will get a new version within a very few weeks, so I'm not sure I want to push them on you too much.
I previously did a review on Grant's first edition book on execution plans. IMHO, it's a big-time "Must Own/Must Read".
--Jeff Moden
Change is inevitable... Change for the better is not.