Out of control table

  • Why can't you just use MAX([File_Date]) as End_File_Date?

Are you looking to replace the existing table with a cleaned-up version, or do you intend to leave the existing table as-is and create a separate table or view containing the summarized data?

The SQL to summarize with first and last dates is straightforward:

    SELECT col1, col2, col3, MIN(date) AS first_date, MAX(date) AS last_date FROM your_table GROUP BY col1, col2, col3

Can you explain what you mean when you say a record was removed from the source file?

If you could provide sample data for a single account number, showing the raw data and the desired output, it would help us figure out the context of your query. Ideally keep it simple with col1, col2, col3 rather than all the columns. If you invest the time to provide a script that generates the table structure and the sample data, you will get a quicker response; a sketch of that kind of script follows below.
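For illustration only, here is the kind of repro script being asked for. The table dbo.SourceLoad and the columns AccountNo, SomeValue, and File_Date are hypothetical stand-ins for the real schema:

    -- Hypothetical table structure and sample data
    CREATE TABLE dbo.SourceLoad
    (
        AccountNo INT         NOT NULL,
        SomeValue VARCHAR(20) NOT NULL,
        File_Date DATE        NOT NULL  -- date of the daily CSV the row arrived in
    );

    INSERT INTO dbo.SourceLoad (AccountNo, SomeValue, File_Date)
    VALUES (1001, 'A', '2015-03-01'),
           (1001, 'A', '2015-03-02'),
           (1001, 'A', '2015-03-03'),
           (1002, 'B', '2015-03-01'),
           (1002, 'B', '2015-03-03');  -- absent from the 3/2 file

    -- Collapse the duplicates: one row per distinct record,
    -- with the first and last file dates it appeared in
    SELECT AccountNo,
           SomeValue,
           MIN(File_Date) AS Start_File_Date,
           MAX(File_Date) AS End_File_Date
    FROM dbo.SourceLoad
    GROUP BY AccountNo, SomeValue;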

The problem is that the source data loaded into the table for the last 4 years comes from a CSV that contains all the records each day. What I mean is that the file loaded on 3/2/2015 has data from 3/1/2015 back to 6/7/2011, and then the file loaded on 3/3/2015 has all the data from 3/2/2015 back to 6/7/2011. Whoever set this up is not loading just the new records; they are appending the entire file to the end of the table. BUT I am also finding records that drop out in the middle, so they were not present in every source file for some reason.
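Given that description, one way to find the records that dropped out mid-stream is to compare how many daily files each record actually appeared in against the span of days between its first and last appearance. This sketch reuses the hypothetical dbo.SourceLoad table above and assumes exactly one file per calendar day (weekend or holiday gaps would need extra handling):

    -- Records whose file count falls short of the day span between
    -- first and last appearance were missing from some daily files
    SELECT AccountNo,
           SomeValue,
           MIN(File_Date) AS Start_File_Date,
           MAX(File_Date) AS End_File_Date,
           COUNT(DISTINCT File_Date) AS files_seen,
           DATEDIFF(DAY, MIN(File_Date), MAX(File_Date)) + 1 AS files_expected
    FROM dbo.SourceLoad
    GROUP BY AccountNo, SomeValue
    HAVING COUNT(DISTINCT File_Date) < DATEDIFF(DAY, MIN(File_Date), MAX(File_Date)) + 1;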

  • CELKO (3/12/2015)

    Since BIGINT is larger than the number of atoms in the universe, we do not use it very often.

No, not even close. There are at least 10^78 atoms, whereas BIGINT tops out just under 10^19 (2^63 - 1, about 9.2 x 10^18).

SQL DBA, SQL Server MVP (07, 08, 09). "It's a dog-eat-dog world, and I'm wearing Milk-Bone underwear." "Norm", on "Cheers". Also from "Cheers", from "Carla": "You need to know 3 things about Tortelli men: Tortelli men draw women like flies; Tortelli men treat women like flies; Tortelli men's brains are in their flies".

  • CELKO (3/17/2015)


No, not even close. There are at least 10^78 atoms, whereas BIGINT tops out just under 10^19 (2^63 - 1, about 9.2 x 10^18).

    My bad :rolleyes:

A quick Google: it is believed that between 120 and 300 sextillion (that's 1.2 x 10^23 to 3.0 x 10^23) stars exist within our observable universe.

So if your company is doing business with the observable universe, then you need at least BIGINT for a key (in fact 10^23 overflows even BIGINT; you would need DECIMAL(25,0) or wider), and probably a stardate instead of a Common Era calendar 😉

According to the UK Government, BIGINT is needed to cover the number of foreigners that want to migrate to the UK; I guess they are coming from all these places in the observable universe.

    😎

  • ScottPletcher (3/16/2015)


    CELKO (3/12/2015)

    Since BIGINT is larger than the number of atoms in the universe, we do not use it very often.

No, not even close. There are at least 10^78 atoms, whereas BIGINT tops out just under 10^19 (2^63 - 1, about 9.2 x 10^18).

That would be closer to the age of the universe in seconds (about 4.4 x 10^17); one can easily confuse those numbers ;-)

    😎

  • Eirikur Eiriksson (3/17/2015)


    ScottPletcher (3/16/2015)


    CELKO (3/12/2015)

    Since BIGINT is larger than the number of atoms in the universe, we do not use it very often.

No, not even close. There are at least 10^78 atoms, whereas BIGINT tops out just under 10^19 (2^63 - 1, about 9.2 x 10^18).

That would be closer to the age of the universe in seconds (about 4.4 x 10^17); one can easily confuse those numbers ;-)

    😎

    I love it when a forum topic drifts completely off-topic 😛

  • Since BIGINT is larger than the number of atoms in the universe, we do not use it very often. What math requires this size?

The reason for using BIGINT probably has something to do with the maximum value that INT can handle: 2,147,483,647, roughly 2.1 billion. A table expected to accumulate more rows than that needs a wider key, and BIGINT is the natural next step.
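For concreteness, here is a quick T-SQL check of the two limits (this makes no assumptions about the original poster's table):

    -- INT tops out at 2,147,483,647 (~2.1 x 10^9);
    -- BIGINT tops out at 9,223,372,036,854,775,807 (~9.2 x 10^18)
    SELECT CAST(2147483647 AS INT)             AS int_max,
           CAST(9223372036854775807 AS BIGINT) AS bigint_max;

    -- One past INT's ceiling no longer fits:
    -- SELECT CAST(2147483648 AS INT);  -- arithmetic overflow error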
