Avoiding Unexpected Results with DATEDIFF in Year Comparisons

  • Paw Jershauge (Orifarm) - Wednesday, April 11, 2018 2:16 AM

    I guess my point was missed as well... systems that are over 50 years old should definitely have been updated along the way, and should you, during an upgrade, find that you would encounter such a calculation issue, which I highly doubt, then upgrade... there's absolutely NO need to take into account ranges that are not in your scope, regardless. THIS is not a storage issue, it's a calculation issue which won't ever be affected by what I do, and I'm 10000000000% sure that my program/calculation running on a SQL Server won't be here in 200 years... and the Y2K bug was not a calculation issue, it was storage... (YY)

    And this is why the Y2K bug existed: because of that mentality (no offence).

    Thom~

    Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.
    Larnu.uk

  • Thom A - Wednesday, April 11, 2018 2:26 AM

    And this is why the Y2K bug existed: because of that mentality (no offence).

    I disagree, and don't worry, I'm not offended (it's not the first time I've disagreed with other users online) 😉... but let's just agree to disagree. Like I wrote in my first post, it's the shortest and fastest way to get the correct year difference between two dates within the range of 1900 to 2400.

  • Thom A - Wednesday, April 11, 2018 2:26 AM

    And this is why the Y2K bug existed: because of that mentality (no offence).

    I had a similar thing, working on a system that handled stock market indices. "You can set the maximum for the Dow Jones to 10.000," I was told :blink:
    😎
    It reached 10.000 in early 1999.

  • Eirikur Eiriksson - Wednesday, April 11, 2018 2:38 AM

    I had a similar thing, working on a system that handled stock market indices. "You can set the maximum for the Dow Jones to 10.000," I was told :blink:
    😎
    It reached 10.000 in early 1999.

    But did you store that data in a data type that could ONLY hold up to 10000? I think not... so again, it's not the same...

  • Paw Jershauge (Orifarm) - Wednesday, April 11, 2018 12:32 AM

    Well, working with financial data, I won't ever come across any dates before 1900, and certainly no dates after 2200... so I don't see why I should be bothered with failures outside my range... But if you think your program will survive 200 years from now... be my guest and take ALL the exceptions that may and may not occur into consideration. :rolleyes:

    You are not understanding what I am saying. You don't understand the complexities of the data and you prefer hacks over solutions. This kind of thinking is dangerous and I don't want it anywhere near me.

  • GeorgeCopeland - Wednesday, April 11, 2018 6:32 AM

    You are not understanding what I am saying. You don't understand the complexities of the data and you prefer hacks over solutions. This kind of thinking is dangerous and I don't want it anywhere near me.

    I still disagree with you. I fully understand my datatypes and ranges; for me this is not a HACK, it's just as valid a solution as the one that covers ALL possible pitfalls, especially since I know my ranges and datatypes.
    So regardless of whether you want it near you, it still works.

  • Paw Jershauge (Orifarm) - Wednesday, April 11, 2018 6:42 AM

    I still disagree with you. I fully understand my datatypes and ranges; for me this is not a HACK, it's just as valid a solution as the one that covers ALL possible pitfalls, especially since I know my ranges and datatypes.
    So regardless of whether you want it near you, it still works.

    Covering a large hole in your wall with tarpaulin "works", but I would never say it was a solution. 🙂

    Thom~

    Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.
    Larnu.uk

  • Thom A - Wednesday, April 11, 2018 7:07 AM

    Covering a large hole in your wall with tarpaulin "works", but I would never say it was a solution. 🙂

    I can't compare... :Wow:😎 sorry... once again, I (as in I) won't ever get outside the range where all the issues occur, so why bother??? I'm not preparing my code for 128-, 256-, or 512-bit systems either... 😉

  • Paw Jershauge (Orifarm) - Wednesday, April 11, 2018 2:41 AM

    But did you store that data in a data type that could ONLY hold up to 10000? I think not... so again, it's not the same...

    Legacy system, numbers stored as characters in text journals; the debate was whether 4 or 5 characters should be allocated 😉
    😎

    Of course I ignored it and used 5 => '99999', so if the system is still in use, which I doubt, it should still be fine.
    I heard of some frantic work on other similar systems in early 1999, and that wasn't the Y2K problem.

  • Paw Jershauge (Orifarm) - Wednesday, April 11, 2018 6:42 AM

    I still disagree with you. I fully understand my datatypes and ranges; for me this is not a HACK, it's just as valid a solution as the one that covers ALL possible pitfalls, especially since I know my ranges and datatypes.
    So regardless of whether you want it near you, it still works.

    You might want to reconsider your stance on this subject. Here's your code, using dates that are well within your useful date range.

    DECLARE @STARTDATE DATETIME = '20180411';
    DECLARE @TODAY     DATETIME = '20190411';
    SELECT FLOOR(DATEDIFF(DAY, @STARTDATE, @TODAY) * 1.0 / 365.25);

    RESULTS:
    ---------------------------------------
    0

    (1 row(s) affected)

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Eirikur Eiriksson - Wednesday, April 11, 2018 7:17 AM

    Legacy system, numbers stored as characters in text journals; the debate was whether 4 or 5 characters should be allocated 😉
    😎

    Of course I ignored it and used 5 => '99999', so if the system is still in use, which I doubt, it should still be fine.
    I heard of some frantic work on other similar systems in early 1999, and that wasn't the Y2K problem.

    I'm speechless :laugh::laugh::laugh: numbers as characters... hmmm, in what world was that ever a good idea... cough cough :alien::Whistling::Whistling::Whistling:

  • Jeff Moden - Wednesday, April 11, 2018 7:23 AM

    You might want to reconsider your stance on this subject. Here's your code, using dates that are well within your useful date range.

    DECLARE @STARTDATE DATETIME = '20180411';
    DECLARE @TODAY     DATETIME = '20190411';
    SELECT FLOOR(DATEDIFF(DAY, @STARTDATE, @TODAY) * 1.0 / 365.25);

    RESULTS:
    ---------------------------------------
    0

    (1 row(s) affected)

    +365.25
    😎
    FLOOR or CEILING on an approximation of an average can never be exact.
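
    For reference, an exact count of whole years needs no average-year approximation at all: take the raw year difference and subtract one if the anniversary of the start date has not yet been reached. A minimal sketch of that logic (written in Python rather than T-SQL for brevity; the function name `full_years_between` is illustrative, not from this thread):

```python
from datetime import date

def full_years_between(start: date, end: date) -> int:
    # Raw year difference -- the analogue of DATEDIFF(YEAR, ...),
    # which only counts year-boundary crossings...
    years = end.year - start.year
    # ...minus one if the anniversary of `start` hasn't occurred yet
    # in `end`'s year (compare (month, day) pairs lexicographically).
    if (end.month, end.day) < (start.month, start.day):
        years -= 1
    return years

print(full_years_between(date(2018, 4, 11), date(2019, 4, 11)))  # 1
print(full_years_between(date(2018, 4, 11), date(2019, 4, 10)))  # 0
```

    The same anniversary correction is commonly written in T-SQL as `DATEDIFF(YEAR, @start, @end)` minus 1 when `DATEADD(YEAR, DATEDIFF(YEAR, @start, @end), @start)` lands after `@end`.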

  • Paw Jershauge (Orifarm) - Wednesday, April 11, 2018 6:42 AM

    I still disagree with you. I fully understand my datatypes and ranges; for me this is not a HACK, it's just as valid a solution as the one that covers ALL possible pitfalls, especially since I know my ranges and datatypes.
    So regardless of whether you want it near you, it still works.

    True story: We will never need more than a billion in our currency column. The first Japanese company we tried to bring online broke our system with Yen. I am not saying over-engineer a system for every possible contingency. What I am saying is this: don't make your system break because you hacked the currency datatype. And if you will hack a date, you will hack anything. Final thing: datetime is a data structure. Microsoft could change the way it works tomorrow. Everything using datetime functions will still work. Your hack will not.

  • Paw Jershauge (Orifarm) - Wednesday, April 11, 2018 7:24 AM

    I'm speechless :laugh::laugh::laugh: numbers as characters... hmmm, in what world was that ever a good idea... cough cough :alien::Whistling::Whistling::Whistling:

    Guess you haven't seen anything older than Windows 😉
    😎

    These systems were running on the likes of the IBM System/390, ES/9000, and HP 3000, and were based on even older systems.

  • GeorgeCopeland - Wednesday, April 11, 2018 7:34 AM

    True story: We will never need more than a billion in our currency column. The first Japanese company we tried to bring online broke our system with Yen. I am not saying over-engineer a system for every possible contingency. What I am saying is this: don't make your system break because you hacked the currency datatype. And if you will hack a date, you will hack anything. Final thing: datetime is a data structure. Microsoft could change the way it works tomorrow. Everything using datetime functions will still work. Your hack will not.

    It's especially painful when the hack doesn't actually work even for the supposedly well-understood range of data. See my previous post above. It had to fail: 365/365.25 is always just under 1, so FLOOR can never round it up to the full year.
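
    That arithmetic is easy to check. A quick sketch (in Python, mirroring the T-SQL expression above):

```python
import math

# One full calendar year, 2018-04-11 to 2019-04-11, spans 365 days,
# but dividing by the 365.25-day "average year" leaves the ratio just under 1.
days = 365
ratio = days * 1.0 / 365.25
print(ratio)              # ~0.99932 -- just short of 1
print(math.floor(ratio))  # 0, even though a full year has elapsed
```

    Only after a leap day falls inside the range (366 days) does the ratio exceed 1, which is why the FLOOR approach undercounts most year boundaries.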

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

Viewing 15 posts - 31 through 45 (of 51 total)
