I will repeat my request for sample data, and it needs to be in the form of a table with rows inserted into it. Give me that and I...
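For example, readily consumable sample data looks something like this (a hypothetical sketch; the TbExample column names come from the query quoted a few posts below, but the data types and values are guesses):

-- Made-up table and rows, purely to illustrate the requested format
CREATE TABLE #TbExample
(
    IceRink         varchar(50),
    Country_IceRink varchar(50),
    Name            varchar(50),
    Gender          varchar(10),
    Country_Persons varchar(50),
    Time            decimal(8,2),
    Distance        varchar(10)
);

INSERT INTO #TbExample (IceRink, Country_IceRink, Name, Gender, Country_Persons, Time, Distance)
VALUES ('Thialf', 'Netherlands', 'Skater A', 'Man', 'Netherlands', 34.50, '500'),
       ('Thialf', 'Netherlands', 'Skater B', 'Man', 'Norway', 69.80, '1000');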
December 18, 2014 at 2:20 pm
Can't you just do something like this:
SELECT IceRink, Country_IceRink, Name, Gender, Country_Persons, Time, Distance
FROM TbExample
WHERE (Distance = '500' AND Gender = 'Man')
OR (Distance = '1000' AND Gender = 'Man')
OR...
December 18, 2014 at 1:26 pm
Please provide sample rows and sample output. I would think at a minimum you would need to include the gender and distance columns so you can specify what the...
December 18, 2014 at 12:35 pm
Don't have time to dig into it right now, but over the years and versions there have been any number of buffer-flushing bugs that have hit SQL Server. Some of them...
December 18, 2014 at 9:10 am
P.S. If the number of tables can possibly vary, I'd better opt for a FULL JOIN instead of LEAD/LAG.
I don't think FULL JOIN would be required. LEAD/LAG should still work...
December 18, 2014 at 7:30 am
Look up the LAG or LEAD keywords in Books Online.
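A minimal sketch of what that looks like (hypothetical table and column names; LAG and LEAD require SQL Server 2012 or later):

-- Compare each row's value with the previous and next rows, ordered by EventDate
SELECT EventDate,
       Amount,
       LAG(Amount)  OVER (ORDER BY EventDate) AS PrevAmount,
       LEAD(Amount) OVER (ORDER BY EventDate) AS NextAmount
FROM dbo.SomeTable;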
December 18, 2014 at 6:38 am
Use the right tool for the job. Sometimes that means truncating and (re)loading from scratch is best. Sometimes you don't KNOW what changes to apply so you have...
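A bare-bones sketch of the truncate-and-reload approach (hypothetical table names; assumes no foreign keys reference the target):

-- Wipe the target and repopulate it from the source in full
TRUNCATE TABLE dbo.TargetTable;

INSERT INTO dbo.TargetTable (Col1, Col2)
SELECT Col1, Col2
FROM dbo.SourceTable;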
December 16, 2014 at 1:11 pm
Adding to the good stuff Grant wrote, be sure you search ALL of your "source code" for each index name you may intend to DELETE!! Failure to do so...
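One way to check the code stored in the database itself (a sketch; the index name is a placeholder, and this does not cover SSIS packages, application code, or scripts kept outside the database):

-- Find procedures, views, functions, and triggers whose definitions
-- mention the index name, e.g. in an index hint
SELECT OBJECT_SCHEMA_NAME(object_id) AS SchemaName,
       OBJECT_NAME(object_id)        AS ObjectName
FROM sys.sql_modules
WHERE definition LIKE '%IX_SomeIndex%';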
December 16, 2014 at 11:31 am
DB1 32.24219 67.87012 0 <-- 68%
DB2 739.4922 94.87454 0 <-- 95%
DB1 just has a tiny tlog file. It may or may not need to be bigger. 68% full...
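Assuming the figures above came from DBCC SQLPERF, whose LOGSPACE output reports Database Name, Log Size (MB), Log Space Used (%), and Status, they can be reproduced with:

-- Log file size and percent used for every database on the instance
DBCC SQLPERF (LOGSPACE);

-- Or, for the current database only (SQL Server 2012 and later)
SELECT total_log_size_in_bytes / 1048576.0 AS LogSizeMB,
       used_log_space_in_percent AS LogSpaceUsedPct
FROM sys.dm_db_log_space_usage;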
December 16, 2014 at 6:40 am
cre8tivedaze (12/13/2014)
December 15, 2014 at 2:16 pm
Ahh, my favorite TSQL word comes into play here: CASE
Try working something like this into your query:
SUM(CASE WHEN year = 2012 THEN a.totalWords ELSE 0 END) AS [2012TotalWords],
SUM(CASE WHEN...
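A fuller sketch of the conditional-aggregation pattern (the table name, alias, and grouping column are guesses based on the snippet above):

-- One summed column per year, pivoted with CASE inside SUM
SELECT a.AuthorID,
       SUM(CASE WHEN a.[year] = 2012 THEN a.totalWords ELSE 0 END) AS [2012TotalWords],
       SUM(CASE WHEN a.[year] = 2013 THEN a.totalWords ELSE 0 END) AS [2013TotalWords]
FROM dbo.Articles AS a
GROUP BY a.AuthorID;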
December 13, 2014 at 6:01 am
Also, I checked the size of the transaction log files, and it's not that big. We have a transaction backup job running every 20 mins, which will shrink the log...
December 11, 2014 at 11:33 pm
Plan guides also don't help with poor performance caused by data-value skew, where a good plan for one set of inputs is a (horribly) bad plan for another.
I suppose we will...
December 11, 2014 at 8:04 am
One solution is to add an "optimize for" hint to the particular sensitive query and supply a known value that gives the desired plan.
Or use "optimize for unknown", so the...
December 11, 2014 at 7:10 am
1) I recommend investigating trace flag 2371 (sketched below), which lowers the row-modification threshold that triggers automatic statistics updates on larger tables.
2) No matter what you do, this scenario almost certainly...
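A quick sketch of enabling the trace flag globally (it is more commonly set as the -T2371 startup parameter, and the behavior is on by default from SQL Server 2016 at compatibility level 130):

-- Lower the row-modification threshold for auto stats updates on large tables
DBCC TRACEON (2371, -1);

-- Confirm the flag is active
DBCC TRACESTATUS (2371);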
December 11, 2014 at 4:54 am