Viewing 15 posts - 571 through 585 (of 1,496 total)
>> I am considering dropping the database backups processes on the argument that we can always restore the VM and replay the transaction logs.
Are you in a position to...
March 31, 2017 at 8:47 am
I doubt the PK of...
March 27, 2017 at 10:50 am
You have not really provided enough detail.
Assuming the column to be updated is in the same row, try the following outline:
UPDATE Y
SET YourUpdateColumn = 'WhatYouWantToUpdate'
FROM dbo.YourTable AS Y  -- add JOINs/WHERE as needed
March 27, 2017 at 9:13 am
March 27, 2017 at 8:49 am
If you want an EXISTS equivalent to the IN, try:
SELECT adminproduct_id, rowversionnumber, therapyadmin_id, packagedproduct_id, adminrecordmethod
    , actioncode, doseamount, doseunits, dispensableproduct_id, dispensesize, routedproduct_id
    , dosetext, lotnumber, expirationdate, manufacturercode
FROM hcs.adminproduct P
WHERE EXISTS
(
SELECT 1
FROM hcs.therapyadmin T
WHERE T.therapyadmin_id...
March 27, 2017 at 8:19 am
If you are on at least SQL2012, I suspect the easiest approach would be to create a sequence and then add that sequence as the default constraint on the column.
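A minimal sketch of that approach, assuming SQL Server 2012+ and a hypothetical dbo.Orders table with an OrderNumber column (names are placeholders):

```sql
-- Create a sequence, then wire it up as the column's default.
CREATE SEQUENCE dbo.OrderNumberSeq
    AS INT
    START WITH 1
    INCREMENT BY 1;

ALTER TABLE dbo.Orders
    ADD CONSTRAINT DF_Orders_OrderNumber
    DEFAULT (NEXT VALUE FOR dbo.OrderNumberSeq) FOR OrderNumber;
```

Inserts that omit OrderNumber will then pick up the next sequence value automatically.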
March 23, 2017 at 3:26 am
You probably need to ensure that all the filters on the outer table are in the JOIN and not the WHERE clause.
Logically the FROM clause, with JOINS, is processed...
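To illustrate with hypothetical Customers/Orders tables: a filter on the outer-joined table must go in the ON clause, because a WHERE filter is applied after the join and discards the NULL-extended rows, silently turning the LEFT JOIN into an INNER JOIN.

```sql
SELECT c.CustomerID, o.OrderID
FROM dbo.Customers AS c
LEFT JOIN dbo.Orders AS o
    ON  o.CustomerID = c.CustomerID
    AND o.OrderDate >= '20170101'   -- filter here: customers with no 2017 orders are kept
-- WHERE o.OrderDate >= '20170101'  -- filter here: those customers are thrown away
```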
March 22, 2017 at 6:57 am
Jon 0x7f800000 - Monday, March 6, 2017 9:47 AM: What confuses me though
It confuses me as well! I find the SQL Server documentation...
March 6, 2017 at 2:24 pm
Sounds like a race condition on insert.
I would forget about the explicit transaction and run the following:
UPDATE t1
SET t1.column1 = l.staging_2,
    t1.column2 = l.staging_3
March 6, 2017 at 9:11 am
Also, check the number of VLFs on the source db.
DBCC LOGINFO
If the number of rows returned is greater than about 200, look at reducing them:
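A sketch of the usual VLF-reduction routine: shrink the log as far as possible, then regrow it in one large step so it is rebuilt with a small number of big VLFs. Database and logical file names below are placeholders; run after a log backup so the shrink can actually reclaim space.

```sql
USE YourDb;

DBCC LOGINFO;                                     -- one row per VLF; count them

DBCC SHRINKFILE (YourDb_log, 1);                  -- shrink the log to near-empty

ALTER DATABASE YourDb
    MODIFY FILE (NAME = YourDb_log, SIZE = 8GB);  -- regrow in a single increment
```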
February 16, 2017 at 7:46 am
Your problem is the cursor.
It is difficult to write a set-based approach without DDL and consumable test data.
Try something like the following:
--declare @TransactionEffDate DATETIME2(3) =...
February 16, 2017 at 7:22 am
Bulk-logged still has a couple of problems:
1. Point in time recovery will not work for any time the DB was in bulk-logged mode.
2. While bulk-logged recovery logs less...
February 14, 2017 at 6:26 am
You might be able to use fn_dump_dblog to read your log backups:
December 20, 2016 at 7:38 am