Viewing 15 posts - 16 through 30 (of 205 total)
Scott, I'm looking through the query, running against my full dataset and it looks pretty close. I'll need to add more columns to my final dataset as there are a...
August 7, 2020 at 12:04 am
Actually, I used all of the values listed in the table as the values that identify duplicates (those are the columns ROW_NUMBER partitioned by to create the RowNum column). Does that...
August 6, 2020 at 6:06 pm
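For readers following along, the ROW_NUMBER/RowNum pattern described above is typically written like this. This is a sketch only: the table name and the exact partition columns are my assumption from the column names mentioned elsewhere in the thread, not the poster's actual code.

```sql
-- Assumed table/columns (#TestTrips, ldate, TripID, clientid, license#, AAT, ADT);
-- the PARTITION BY list should be whatever set of columns defines a duplicate.
WITH numbered AS (
    SELECT  ldate, TripID, clientid, [license#], AAT, ADT,
            ROW_NUMBER() OVER (
                PARTITION BY ldate, TripID, clientid, [license#], AAT, ADT
                ORDER BY ldate
            ) AS RowNum
    FROM #TestTrips
)
DELETE FROM numbered
WHERE RowNum > 1;   -- keeps exactly one row from each duplicate set
```

Deleting through the CTE works here because the CTE is based on a single table; selecting `WHERE RowNum = 1` instead would keep the same rows without modifying the source.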
Not quite. The RIN# is not a unique number here; that number will exist multiple times for a given day and provider, so this will not return the expected results across...
August 6, 2020 at 5:57 pm
You're right, we were trying to eliminate the duplicates as that was not correct. Sample data was posted at the beginning of the post, but I was unable to...
April 28, 2020 at 10:06 pm
I think it's fixed! I just ran the code you just sent with the Not Exists change and I'm no longer seeing duplicates with any of the months I'm running. ...
April 28, 2020 at 4:46 pm
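The "Not Exists change" mentioned above is presumably the standard guard against re-inserting rows already captured. A hedged sketch of that pattern, with the target table `#Results` and the matching columns being my assumptions rather than the thread's actual code:

```sql
-- Hypothetical: #Results and the correlation columns are illustrative only.
INSERT INTO #Results (ldate, TripID, clientid, [license#])
SELECT t.ldate, t.TripID, t.clientid, t.[license#]
FROM #TestTrips t
WHERE NOT EXISTS (
    SELECT 1
    FROM #Results r
    WHERE r.ldate      = t.ldate
      AND r.TripID     = t.TripID
      AND r.clientid   = t.clientid
      AND r.[license#] = t.[license#]
);
```

The correlation list must cover every column that defines a duplicate; if any identifying column is left out, duplicates can still slip through, which may explain the earlier runs that only reduced the count.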
Thanks Brian. Yep, they're definitely duplicates.
Tried this code and I'm still seeing dups. For the February data there are 250 of them.
Would it help to see what the dups look...
April 28, 2020 at 2:55 pm
So that reduced the number of duplicates down to 268, but didn't eliminate them.
I'm trying to see if I can reproduce the issue with a smaller subset of data that I...
April 27, 2020 at 10:33 pm
Ok Brian, I'm scratching my head here now.
I ran this same query for a different date range and I'm now getting a bunch of duplicated records for some reason. I'm...
April 27, 2020 at 9:12 pm
Man, I applaud your perseverance on this as it's truly impressive.
My validation suggests this is indeed the trick! Since the full dataset has over 29K records it takes about...
April 23, 2020 at 11:18 pm
Wow! This is by far the closest we've gotten. Great work.
Here's the odd thing: When I run this query against the test data it's dropping one of the trips from...
April 23, 2020 at 8:02 pm
I still can't quite get this figured out. It looks like the groupings will work based on the License# but I can't figure out the groupings within a license# (multiple...
April 23, 2020 at 2:59 pm
I think I just got something!
I tried this:
SELECT DISTINCT t.rn, t.ldate, t.TripID, t.clientid, t.license#, t.AAT, t.ADT
INTO #test
FROM #TestTrips t
INNER JOIN #TestTrips t2
    ON [t].ldate = [t2].ldate AND [t].[license#] = [t2].[license#] AND...
April 22, 2020 at 6:14 pm
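The self-join above is cut off before its overlap condition. A hedged guess at how the complete query might read, assuming AAT and ADT are the trip's start and end times (the thread never states this), uses the classic interval-overlap test:

```sql
-- Sketch only: the AAT/ADT overlap condition and the TripID inequality are
-- my assumptions about how the truncated query continues.
SELECT DISTINCT t.rn, t.ldate, t.TripID, t.clientid, t.[license#], t.AAT, t.ADT
INTO #test
FROM #TestTrips t
INNER JOIN #TestTrips t2
    ON  t.ldate      = t2.ldate
    AND t.[license#] = t2.[license#]
    AND t.TripID    <> t2.TripID   -- don't match a trip to itself
    AND t.AAT < t2.ADT             -- two intervals overlap exactly when
    AND t2.AAT < t.ADT;            -- each starts before the other ends
```

Note that a pairwise self-join finds trips that overlap *some* other trip, but it cannot by itself assign chains of overlaps (A overlaps B, B overlaps C) to a single group; that needs the islands-style approach discussed later in the thread.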
Ok, first question: Yes, they would all be one group. Essentially what we're trying to establish here is when one vehicle actually is transporting more than one passenger at a...
April 22, 2020 at 5:46 pm
All of the trips in the dataset should be overlapping with at least 1 other trip, so there should always be at least 2 trips in each group but some...
April 22, 2020 at 5:10 pm
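Grouping chains of overlapping trips per vehicle and day, as described in the two posts above, is a textbook gaps-and-islands problem. A sketch under the same assumed names (#TestTrips, AAT/ADT as start/end times), which is my own framing rather than the solution actually posted:

```sql
-- A new island starts whenever a trip begins after every earlier trip
-- (for the same day and vehicle) has already ended.
WITH ordered AS (
    SELECT *,
           MAX(ADT) OVER (
               PARTITION BY ldate, [license#]
               ORDER BY AAT
               ROWS BETWEEN UNBOUNDED PRECEDING AND 1 PRECEDING
           ) AS prev_max_end
    FROM #TestTrips
),
islands AS (
    SELECT *,
           SUM(CASE WHEN AAT > prev_max_end THEN 1 ELSE 0 END) OVER (
               PARTITION BY ldate, [license#]
               ORDER BY AAT
               ROWS UNBOUNDED PRECEDING
           ) AS grp            -- running count of island boundaries = group id
    FROM ordered
)
SELECT ldate, [license#], grp, COUNT(*) AS trips_in_group
FROM islands
GROUP BY ldate, [license#], grp
HAVING COUNT(*) >= 2;          -- per the post above, groups of 1 are unexpected
```

The HAVING clause reflects the expectation stated above that every trip overlaps at least one other; dropping it would surface any singleton trips that violate that assumption.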
Sorry, I misread your post initially....
Each trip listed has to be unique, so all the trips that overlap each other should be one grouping. So the groupings of 2...
April 22, 2020 at 4:38 pm