Viewing 15 posts - 1,201 through 1,215 (of 1,468 total)
CREATE TABLE #TestData (
SampleTime DATETIME
);
INSERT INTO #TestData ( SampleTime )
VALUES ( '2017-03-27 09:24:00.000' )
, ( '2017-03-27 09:48:00.000' )
, ( '2017-03-27...
April 7, 2017 at 7:22 am
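The sample data above is truncated in the preview and the measure being totalled is not shown; as a sketch only (assuming SQL Server 2012+ and that elapsed time is what is being accumulated), a running total over #TestData can be written with window functions:

```sql
-- Sketch only: the thread preview is truncated, so the quantity being
-- accumulated is an assumption. This shows the running-total pattern
-- over #TestData (SQL Server 2012+).
SELECT
    SampleTime,
    -- minutes since the previous sample (NULL on the first row)
    DATEDIFF(MINUTE,
             LAG(SampleTime) OVER (ORDER BY SampleTime),
             SampleTime) AS MinutesSincePrev,
    -- running total: minutes elapsed since the first sample
    DATEDIFF(MINUTE,
             MIN(SampleTime) OVER (),
             SampleTime) AS RunningMinutes
FROM #TestData
ORDER BY SampleTime;
```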
Thank you for the response.
Your SQL code works and I achieve the Running Total...
April 7, 2017 at 5:00 am
Your requirements for filtering the data are quite vague.
The code below joins the data and returns all of it; you can tweak the WHERE clause as needed.
April 7, 2017 at 12:32 am
The reason that you are getting duplicates is because your source data is all the same. If you want to prevent duplicates in your results, then you need to find...
April 7, 2017 at 12:03 am
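The advice above is cut off mid-sentence; one common way to "find" and suppress duplicates in the results is ROW_NUMBER() over the columns that define a duplicate. This is a sketch with placeholder names (KeyCol1, KeyCol2 and dbo.SourceTable stand in for the real schema, which the preview does not show):

```sql
-- Hypothetical sketch: KeyCol1/KeyCol2 and dbo.SourceTable are placeholders.
WITH cteDeduped AS (
    SELECT *,
           ROW_NUMBER() OVER (PARTITION BY KeyCol1, KeyCol2
                              ORDER BY (SELECT NULL)) AS rn
    FROM dbo.SourceTable
)
SELECT *
FROM cteDeduped
WHERE rn = 1;  -- keep one row per duplicate group
```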
This should do the trick
CREATE TABLE #Company (CompanyID INT, CompanyName VARCHAR (100));
CREATE TABLE #KVPairs (CompanyID INT, KeyColumn VARCHAR (100), ValueColumn VARCHAR (100));
INSERT...
April 6, 2017 at 10:24 pm
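The INSERT above is truncated, but given the #Company and #KVPairs tables it sets up, one plausible shape of the answer is conditional aggregation to pivot the key/value rows into columns. The key names ('Phone', 'City') are illustrative assumptions, not taken from the thread:

```sql
-- One plausible completion of the truncated post: conditional aggregation
-- pivots key/value rows into one column per key.
SELECT
    C.CompanyName,
    MAX(CASE WHEN K.KeyColumn = 'Phone' THEN K.ValueColumn END) AS Phone,
    MAX(CASE WHEN K.KeyColumn = 'City'  THEN K.ValueColumn END) AS City
FROM #Company AS C
JOIN #KVPairs AS K
    ON K.CompanyID = C.CompanyID
GROUP BY C.CompanyName;
```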
You can try this
-- Note: in T-SQL the CTE must be declared before the INSERT
WITH cteData AS (
SELECT
DC.[State Id] AS stateID
,...
)
INSERT INTO DCert (stateID, custnumber, lastname, firstname, mailAddress)
April 6, 2017 at 3:19 pm
April 6, 2017 at 3:08 pm
Does this give you the required output?
-- This date is used so that SQL does not get an overflow error when calculating a large time...
April 6, 2017 at 1:14 pm
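The overflow the comment above refers to is a known DATEDIFF limit: it returns an INT, so DATEDIFF(SECOND, ...) fails once the two dates are roughly 68 years apart. A sketch of the reference-date workaround (the dates here are illustrative, not from the thread):

```sql
-- DATEDIFF returns an INT, so DATEDIFF(SECOND, ...) overflows for dates
-- roughly 68+ years apart. Splitting the difference into whole days plus
-- leftover seconds keeps each part within INT range:
DECLARE @Start DATETIME = '1900-01-01',
        @End   DATETIME = '2017-04-06';

SELECT CAST(DATEDIFF(DAY, @Start, @End) AS BIGINT) * 86400
     + DATEDIFF(SECOND,
                DATEADD(DAY, DATEDIFF(DAY, @Start, @End), @Start),
                @End) AS TotalSeconds;
-- On SQL Server 2016+, DATEDIFF_BIG(SECOND, @Start, @End) does this directly.
```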
April 6, 2017 at 12:23 pm
It seems you have an issue with duplicate records for the combination (PartNum, DueDate, Quantity).
This should do the trick
WITH cteBaseData AS (
SELECT
PartNum
April 6, 2017 at 5:47 am
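The cteBaseData snippet above is cut off; a sketch of one plausible completion, assuming rows repeating the same (PartNum, DueDate, Quantity) should collapse to a single row (dbo.Orders is a placeholder for the real source table):

```sql
-- Sketch under assumptions: GROUP BY (or SELECT DISTINCT) in the CTE
-- collapses exact duplicates of (PartNum, DueDate, Quantity).
WITH cteBaseData AS (
    SELECT PartNum, DueDate, Quantity
    FROM dbo.Orders          -- hypothetical source table
    GROUP BY PartNum, DueDate, Quantity
)
SELECT PartNum, DueDate, Quantity
FROM cteBaseData
ORDER BY PartNum, DueDate;
```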
Something along the lines of .....
SELECT as many random...
April 5, 2017 at 1:37 pm
This sounds like a running totals issue.
Something along the lines of .....
SELECT as many random questions as needed
WHERE QuestionType = 'Short Note'
AND QuestionComplexity...
April 5, 2017 at 12:52 pm
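"SELECT as many random questions as needed" is commonly written in T-SQL with TOP (@N) ... ORDER BY NEWID(). A sketch only: @N, dbo.Questions, and the complexity value are assumptions, since the original filter is truncated in the preview:

```sql
-- Hypothetical sketch of random sampling with NEWID(); table name and
-- filter values are placeholders, not taken from the thread.
DECLARE @N INT = 5;

SELECT TOP (@N) *
FROM dbo.Questions
WHERE QuestionType = 'Short Note'
  AND QuestionComplexity = 'Easy'
ORDER BY NEWID();  -- NEWID() yields a fresh random order each execution
```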
What about an option to MARK ALL AS READ?
Then when looking at the UNREAD tab, we can start with fresh data.
April 5, 2017 at 11:38 am