Viewing 15 posts - 4,081 through 4,095 (of 6,397 total)
It's basically created a concatenated string of column names as one row instead of multiple rows.
This may also help in understanding the query.
August 6, 2012 at 3:26 am
Well, something needed that space. If you don't know what, then I would recommend tracing the server, but if it only happens once in a blue moon then unless you...
August 6, 2012 at 3:20 am
If you have such a big log after 30 minutes, then change it to 15 minutes.
You can shrink the log in full recovery, but you will have to wait until...
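As a minimal sketch of the shrink-after-log-backup sequence described above (the database name MyDB, the logical file name MyDB_log, and the backup path are all assumptions; check sys.database_files for your actual logical name):

```sql
USE MyDB;
-- Back up the log first so the active portion is marked re-usable.
BACKUP LOG MyDB TO DISK = N'D:\Backups\MyDB_log.trn';
-- Then shrink the log file towards a sensible target (here 1024 MB).
DBCC SHRINKFILE (MyDB_log, 1024);
```

If the active portion of the log sits at the end of the file, the shrink may not release space until further log backups have cycled it round.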
August 6, 2012 at 3:17 am
I would recommend reading this http://www.sqlservercentral.com/articles/Administration/64582/ before you do anything.
If the log file jumped to that size, it was for a reason and would probably do it again.
As detailed already,...
August 6, 2012 at 3:13 am
Do you have any weekend processes such as index rebuilds, reorganises, etc.?
What is the autogrowth setting of the DB in question?
Do you keep a record of the space used on a...
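To check the autogrowth setting asked about above, something like this against the catalog views works (the database name MyDB is an assumption):

```sql
-- size and growth are stored in 8 KB pages; is_percent_growth flags % growth.
SELECT name,
       type_desc,
       size * 8 / 1024 AS size_mb,
       CASE WHEN is_percent_growth = 1
            THEN CAST(growth AS varchar(10)) + '%'
            ELSE CAST(growth * 8 / 1024 AS varchar(10)) + ' MB'
       END AS autogrowth
FROM sys.master_files
WHERE database_id = DB_ID(N'MyDB');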
August 6, 2012 at 3:09 am
Full backups do not do anything to the transaction log. Only a transaction log backup can mark the log as re-usable.
How often is the log backup taking place?
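One way to answer that question is to check the backup history msdb keeps (database name is illustrative):

```sql
-- type = 'L' is a transaction log backup.
SELECT TOP (10) database_name, backup_start_date
FROM msdb.dbo.backupset
WHERE type = 'L'
  AND database_name = N'MyDB'
ORDER BY backup_start_date DESC;
```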
August 6, 2012 at 3:06 am
Duplicate post, and from the comments on your other posts this looks like homework.
August 6, 2012 at 3:05 am
Have you checked out the Microsoft Learning site?
August 6, 2012 at 3:04 am
The only way I can think of off the top of my head, if you don't want to use subscriptions, is to build your dynamic proc and format it in...
August 6, 2012 at 3:00 am
Dropping and recreating the index would be the quickest solution.
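DROP_EXISTING does the drop and recreate in a single statement; a sketch with illustrative table and index names:

```sql
-- Rebuilds the named index in one operation instead of DROP INDEX + CREATE INDEX.
CREATE NONCLUSTERED INDEX IX_Orders_CustomerID
    ON dbo.Orders (CustomerID)
    WITH (DROP_EXISTING = ON);
```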
August 6, 2012 at 2:57 am
This is standard behaviour: the log will grow larger than you are used to because it is not marked re-usable until a log backup occurs.
How often...
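To confirm it really is the missing log backup holding the log open, check the reuse-wait column on sys.databases (database name is an assumption):

```sql
SELECT name, recovery_model_desc, log_reuse_wait_desc
FROM sys.databases
WHERE name = N'MyDB';
-- LOG_BACKUP here means the log is waiting on a transaction log backup.
```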
August 6, 2012 at 2:54 am
Have you checked with your mail administrators that you can actually send to an outside domain?
August 6, 2012 at 2:47 am
Hi
Please can you post DDL, sample data and expected outcomes so that we can help you out further.
The second link in my signature will help if you get stuck.
Thanks
August 6, 2012 at 2:42 am
Something like this:

WITH cte AS
(
    SELECT
        car.vinno,
        car.model_id,
        --sales_bill.billno,
        --record.recordno,
        MAX(areaofsales.zone_id) AS Zone
    FROM car
    INNER JOIN sales_bill ON car.model_id = sales_bill.model_id
    INNER JOIN record ON sales_bill.billno = record.billno
    INNER JOIN areaofsales ON record.recordno = areaofsales.recordno
    GROUP BY
        car.vinno,
        car.model_id
),
cte2 AS
(
    SELECT
        ROW_NUMBER() OVER (PARTITION BY model_id ORDER BY vinno) AS RowNum,
        vinno,
        model_id,
        Zone
    FROM cte
)
SELECT
    vinno, model_id,...
August 3, 2012 at 8:27 am
Don't you just need to do an ORDER BY vinno ASC or zone_id DESC?
Edit: didn't test with sample data; it's not just an ORDER BY that's needed.
August 3, 2012 at 8:24 am