July 7, 2017 at 5:54 am
Good evening Forum members,
I am a total novice/beginner, so please bear with me if I miss any vital details below.
I have a table with a column, 'created_date', with values such as the following:
1409030583
1409030820
1409031813
1409103924
These I converted to the following:
Line 1: 26/08/2014 5:23
Line 2: 26/08/2014 5:27
Line 3: 26/08/2014 5:44
Line 4: 27/08/2014 1:45
However, the time values obtained above are roughly 2 hours ahead of the values originally captured (shown below):
Line 1: 3802 (26-08-2014 15:23)
Line 2: 8332 (26-08-2014 15:27)
Line 3: 5393 (26-08-2014 15:43)
Line 4: 3608 (27-08-2014 11:45)
I am using SSMS 2017 and would appreciate any guidance on an appropriate T-SQL script to obtain the correct results for these and a couple of thousand other records with the same issue.
Thanks
July 7, 2017 at 5:59 am
That's an unusual source format. What is the datatype of your Created_Date column and how did you perform the conversion?
July 7, 2017 at 6:15 am
Have to agree with Phil here: how does 1409030583 become 2014-08-26 05:23:00.000? I can't see any correlation between these two values, apart from (possibly) the 14 at the start of the integer representing the year.
Trying the value as minutes and then seconds since date 0 (1900-01-01 by default) yields 4579-01-09 11:03:00.000 and 1944-08-26 05:23:03.000 respectively. Wait...
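For anyone reproducing that check, a minimal sketch using the first sample value:
SELECT DATEADD(MINUTE, 1409030583, 0) AS AsMinutes, -- 4579-01-09 11:03:00.000
       DATEADD(SECOND, 1409030583, 0) AS AsSeconds; -- 1944-08-26 05:23:03.000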
...
...
Okay... So, Phil, it looks like it's seconds since 1970-01-01 (very odd, I must say):
SELECT DATEADD(SECOND, S, '1970') AS Date
FROM (VALUES (1409030583),(1409030820),(1409031813),(1409103924)) V(S);
Returns:
Date
-----------------------
2014-08-26 05:23:03.000
2014-08-26 05:27:00.000
2014-08-26 05:43:33.000
2014-08-27 01:45:24.000
So, what is wrong with these dates, then? Could you perhaps provide your SQL here please, OP? We need more detail.
Thom~
Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.
Larnu.uk
July 7, 2017 at 6:15 am
Hi Phil, Thanks for reviewing and responding to my post.
The data type is INT.
Conversion from INT was done using the following approach:
UPDATE dbo.[TableName]
SET [NewColumnName] = DATEADD(SECOND, [OldColumnName], {d '1970-01-01'})
Thanks
JP
July 7, 2017 at 6:19 am
jp.ostrayla - Friday, July 7, 2017 6:15 AM
Hi Phil, Thanks for reviewing and responding to my post. The data type is INT.
Conversion from INT was done using the following approach:
UPDATE dbo.[TableName]
SET [NewColumnName] = DATEADD(SECOND, [OldColumnName], {d '1970-01-01'})
Thanks
JP
Unix epoch!
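(For anyone who hasn't met the term: Unix epoch time counts the seconds elapsed since midnight on 1 January 1970 UTC, which is why Thom's DATEADD from '1970' lines up. A quick sanity check:)
SELECT DATEADD(SECOND, 0, '1970') AS EpochZero;           -- 1970-01-01 00:00:00.000
SELECT DATEADD(SECOND, 1409030583, '1970') AS SampleDate; -- 2014-08-26 05:23:03.000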
________________________________________________________________
you can lead a user to data....but you cannot make them think
and remember....every day is a school day
July 7, 2017 at 6:28 am
Also, 26/08/2014 5:23 is not 2 hours ahead of 26/08/2014 15:23; it's 10 hours behind. 26/08/2014 5:23 = 26/08/2014 05:23 = 26/08/2014 5:23 am.
Are all the times out by 10 hours? If so, then it seems that the value 0 actually represents '1970-01-01 10:00:00.000'. Perhaps this would solve your problem:
SELECT DATEADD(HOUR, 10, DATEADD(SECOND, S, '1970')) AS DateValue
FROM (VALUES (1409030583),(1409030820),(1409031813),(1409103924)) V(S);
--Or you could do:
SELECT DATEADD(SECOND, S, '19700101 10:00:00.000') AS DateValue
FROM (VALUES (1409030583),(1409030820),(1409031813),(1409103924)) V(S);
Thom~
Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.
Larnu.uk
July 7, 2017 at 6:30 am
jp.ostrayla - Friday, July 7, 2017 6:15 AM
Hi Phil, Thanks for reviewing and responding to my post. The data type is INT.
Conversion from INT was done using the following approach:
UPDATE dbo.[TableName]
SET [NewColumnName] = DATEADD(SECOND, [OldColumnName], {d '1970-01-01'})
Thanks
JP
The period from 5:23 to 15:23 is 10 hours, so your converted values are 10 hours behind the results you want. Could it be a time-zone issue? Perhaps the date is stored as UTC, but you want UTC+10?
Building on Thom's code:
SELECT Date = DATEADD(HOUR, 10, DATEADD(SECOND, V.S, '1970'))
FROM
(
VALUES
(1409030583)
, (1409030820)
, (1409031813)
, (1409103924)
) V (S);
July 7, 2017 at 6:31 am
Thom A - Friday, July 7, 2017 6:28 AM
Also, 26/08/2014 5:23 is not 2 hours ahead of 26/08/2014 15:23; it's 10 hours behind. 26/08/2014 5:23 = 26/08/2014 05:23 = 26/08/2014 5:23 am. Are all the times out by 10 hours? If so, then it seems that the value 0 actually represents '1970-01-01 10:00:00.000'. Perhaps this would solve your problem:
SELECT DATEADD(HOUR, 10, DATEADD(SECOND, S, '1970')) AS DateValue
FROM (VALUES (1409030583),(1409030820),(1409031813),(1409103924)) V(S);
--Or you could do:
SELECT DATEADD(SECOND, S, '19700101 10:00:00.000') AS DateValue
FROM (VALUES (1409030583),(1409030820),(1409031813),(1409103924)) V(S);
Bravo!
July 7, 2017 at 6:33 am
Thom A - Friday, July 7, 2017 6:28 AM
Also, 26/08/2014 5:23 is not 2 hours ahead of 26/08/2014 15:23; it's 10 hours behind. 26/08/2014 5:23 = 26/08/2014 05:23 = 26/08/2014 5:23 am. Are all the times out by 10 hours? If so, then it seems that the value 0 actually represents '1970-01-01 10:00:00.000'. Perhaps this would solve your problem:
SELECT DATEADD(HOUR, 10, DATEADD(SECOND, S, '1970')) AS DateValue
FROM (VALUES (1409030583),(1409030820),(1409031813),(1409103924)) V(S);
--Or you could do:
SELECT DATEADD(SECOND, S, '19700101 10:00:00.000') AS DateValue
FROM (VALUES (1409030583),(1409030820),(1409031813),(1409103924)) V(S);
Perhaps those are UTC times from a server whose time-zone offset is +10.
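If you can still query the server that captured the data, a quick (rough) way to see its current UTC offset, bearing in mind that daylight saving can make the historical offset differ:
SELECT DATEDIFF(HOUR, GETUTCDATE(), GETDATE()) AS ServerUtcOffsetHours;
-- On SQL Server 2008 and later, the offset is also available directly:
SELECT SYSDATETIMEOFFSET() AS ServerNowWithOffset;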
July 7, 2017 at 6:38 am
Phil Parkin - Friday, July 7, 2017 6:30 AM
Could it be a time-zone issue? Perhaps the date is stored as UTC, but you want UTC+10?
Oh, nice thought! With that in mind:
SELECT SWITCHOFFSET(DATEADD(SECOND, S, '1970'), '+10:00') AS DateValue
FROM (VALUES (1409030583),(1409030820),(1409031813),(1409103924)) V(S);
Returns:
DateValue
----------------------------------
2014-08-26 15:23:03.000 +10:00
2014-08-26 15:27:00.000 +10:00
2014-08-26 15:43:33.000 +10:00
2014-08-27 11:45:24.000 +10:00
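As a side note: if the server is SQL Server 2016 or later, AT TIME ZONE can do the same conversion by time-zone name rather than a fixed offset, which also handles daylight saving. I'm assuming here that the source is UTC and the target is Australian eastern time, given the +10:00:
SELECT DATEADD(SECOND, S, '1970') AT TIME ZONE 'UTC' AT TIME ZONE 'AUS Eastern Standard Time' AS DateValue
FROM (VALUES (1409030583),(1409030820),(1409031813),(1409103924)) V(S);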
Thom~
Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.
Larnu.uk
July 7, 2017 at 6:43 am
Thanks very much Phil & Thom for being generous with your time and valuable guidance.
What changes to the suggested code do I need to make to apply it to the relevant table, which includes a 'created_date' column and over 120,000 records?
July 7, 2017 at 6:49 am
jp.ostrayla - Friday, July 7, 2017 6:43 AM
Thanks very much Phil & Thom for being generous with your time and valuable guidance. What changes to the suggested code do I need to make to apply it to the relevant table, which includes a 'created_date' column and over 120,000 records?
Change this
SET [NewColumnName] = DATEADD(SECOND, [OldColumnName], {d '1970-01-01'})
to this
SET [NewColumnName] = DATEADD(HOUR, 10, DATEADD(SECOND, [OldColumnName], {d '1970-01-01'}))
July 7, 2017 at 6:51 am
jp.ostrayla - Friday, July 7, 2017 6:43 AM
Thanks very much Phil & Thom for being generous with your time and valuable guidance. What changes to the suggested code do I need to make to apply it to the relevant table, which includes a 'created_date' column and over 120,000 records?
Is the new column to be a static value, or can it be derived? From your query above, I assume derived is fine, so you could do:
--Create sample table
CREATE TABLE SampleDates (IntValue bigint);
GO
--Insert sample data
INSERT INTO SampleDates
VALUES (1409030583),(1409030820),(1409031813),(1409103924);
GO
--Check
SELECT *
FROM SampleDates;
GO
--New Derived Column
ALTER TABLE SampleDates ADD DateValue AS SWITCHOFFSET(DATEADD(SECOND, IntValue, '1970'), '+10:00');
GO
--Check
SELECT *
FROM SampleDates;
GO
--Cleanup
DROP TABLE SampleDates;
GO
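One design note on the derived-column approach above: because DateValue is computed from IntValue when queried, there is no need to run an UPDATE over the 120,000 existing rows, and the converted value can never drift out of sync with the original integer.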
Thom~
Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.
Larnu.uk
July 7, 2017 at 7:05 am
That worked perfectly!!
Given this was my first foray into posting, and both of you were absolutely fantastic, I am immensely grateful that you took the time to read my message and for your brilliant guidance in resolving this issue.
Thanks once again, and I wish you both a lovely day.