April 29, 2016 at 7:37 am
Brandie, that is funny!
Rodders...
April 29, 2016 at 7:50 am
So today is the last day of my current contract / gig.
Been a very odd week, what with being off for two weeks.
So I got tasked with picking up some bits and pieces on the back of the data warehouse we were working on.
Had a meeting on Tuesday with a plan of action. Deployed some of the views to the test environment for review. Emailed the person in MI saying "send me some feedback"... heard nothing. Had the handover meeting just now and the MI person said, oh yes, they're fine... I was expecting something quite different...
Meanwhile the big project that went live last week has had problems since then, and this week has been manic for those working on it. I should have been involved but wasn't, and with a week to go, it wasn't worth anyone bringing me up to speed. So it's been a very odd atmosphere in the office: complete mayhem and madness going on around me, and me sitting in the calm eye of the hurricane.
It's been an odd six-month contract anyway, so this week seems a rather apt, slightly flat way to finish!
Next week!
For those going to SQLNexus - have fun; I would have loved to make that one, but it overlaps with...
SQLBits - make sure to say hi ... It will be good to catch up with the usual mob (Gail, Steve, Grant etc... 😉 )
And if we haven't met in person yet or this is your first SQLBits make sure to come and say hi.
I'll be wearing orange along with the other helpers, so I will be in registration at various times during the week. And probably around Tuesday night for the usual pre-start setup (bag packing etc.).
Attending / Room Monitoring Chris Atkins' training day on Wednesday, Kalen Delaney's on Thursday, and I think I'm loitering around the community corner Friday lunchtime!
Now for the long, long drive home... It's Friday, I have to go via the M25 car park, and it's raining... oh dear...
I know now why I work from home on a Friday!
Cheers,
Rodders...
April 29, 2016 at 8:02 am
rodjkidd (4/29/2016)
SQLBits - make sure to say hi ... It will be good to catch up with the usual mob (Gail, Steve, Grant etc... 😉 )
Errr....
http://sqlinthewild.co.za/index.php/2016/04/26/upcoming-conferences/
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
April 29, 2016 at 8:06 am
Just a follow-up on my other post (and apologies for bringing tech & real work into the Thread, but you guys have the answers so I may as well post the question here).
All the feedback was good and helpful. Now, the remaining question is on the paging. I know of a number of different ways to get this done. However, I suspect you guys have run tests and can make a good recommendation on the best method for breaking apart the data for export (regardless of the precise method of the export). Please don't get hung up on the 10,000 row value either. It was just an arbitrary number. Could be 5,000 or 1,000,000. I just need to be able to efficiently chunk the data needed for export.
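To make the question concrete, here's the sort of thing I have in mind: a minimal keyset ("seek") pagination sketch, where each pass seeks past the last key already exported rather than using OFFSET/FETCH (which, as I understand it, re-reads all the skipped rows on every later page). The table and column names (dbo.SourceTable, Id, Payload) are made up for illustration, and Id is assumed to be a unique, indexed key:

-- Keyset pagination: remember the highest key exported so far and
-- seek past it, so each chunk costs roughly the same to fetch.
DECLARE @ChunkSize INT = 10000,   -- arbitrary, as noted above
        @LastId    INT = 0,
        @Rows      INT = -1;

CREATE TABLE #Chunk (Id INT PRIMARY KEY, Payload NVARCHAR(4000));

WHILE @Rows <> 0
BEGIN
    TRUNCATE TABLE #Chunk;

    INSERT INTO #Chunk (Id, Payload)
    SELECT TOP (@ChunkSize) Id, Payload
    FROM dbo.SourceTable
    WHERE Id > @LastId
    ORDER BY Id;

    SET @Rows = @@ROWCOUNT;

    IF @Rows > 0
    BEGIN
        SELECT @LastId = MAX(Id) FROM #Chunk;
        -- ...hand #Chunk to the export step here...
    END
END;

DROP TABLE #Chunk;

Whether that actually beats the alternatives at 10,000 rows, or at 1,000,000, is exactly what I'm hoping someone here has tested.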
Vielen Dank!
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
April 29, 2016 at 9:34 am
Looking forward to seeing you all at Nexus/Bits.
April 29, 2016 at 10:22 am
Grant Fritchey (4/29/2016)
Just a follow-up on my other post (and apologies for bringing tech & real work into the Thread, but you guys have the answers so I may as well post the question here).
All the feedback was good and helpful. Now, the remaining question is on the paging. I know of a number of different ways to get this done. However, I suspect you guys have run tests and can make a good recommendation on the best method for breaking apart the data for export (regardless of the precise method of the export). Please don't get hung up on the 10,000 row value either. It was just an arbitrary number. Could be 5,000 or 1,000,000. I just need to be able to efficiently chunk the data needed for export.
Vielen Dank!
As punishment for bringing "Real Work(TM)" into The Thread, you must now buy everyone who comes up to you at an event, who utters the phrase "the hippo sent me with the pork chop launcher," a drink, not to exceed $3 USD, from now until 6 May 2016.
:-D:hehe:
April 29, 2016 at 10:46 am
jasona.work (4/29/2016)
Grant Fritchey (4/29/2016)
Just a follow-up on my other post (and apologies for bringing tech & real work into the Thread, but you guys have the answers so I may as well post the question here).
All the feedback was good and helpful. Now, the remaining question is on the paging. I know of a number of different ways to get this done. However, I suspect you guys have run tests and can make a good recommendation on the best method for breaking apart the data for export (regardless of the precise method of the export). Please don't get hung up on the 10,000 row value either. It was just an arbitrary number. Could be 5,000 or 1,000,000. I just need to be able to efficiently chunk the data needed for export.
Vielen Dank!
As punishment for bringing "Real Work(TM)" into The Thread, you must now buy everyone who comes up to you at an event, who utters the phrase "the hippo sent me with the pork chop launcher," a drink, not to exceed $3 USD, from now until 6 May 2016.
:-D:hehe:
Love it!
Wish I could help you with the paging issue, Grant, but I have not done any testing on it. And all the data sets I deal with are VSDB types.
April 29, 2016 at 10:59 am
April 29, 2016 at 11:14 am
No. I meant it as a play on VLDBs as in Very Small Databases. But that's an interesting link. Thanks for posting it.
April 29, 2016 at 11:19 am
jasona.work (4/29/2016)
Grant Fritchey (4/29/2016)
Just a follow-up on my other post (and apologies for bringing tech & real work into the Thread, but you guys have the answers so I may as well post the question here).
All the feedback was good and helpful. Now, the remaining question is on the paging. I know of a number of different ways to get this done. However, I suspect you guys have run tests and can make a good recommendation on the best method for breaking apart the data for export (regardless of the precise method of the export). Please don't get hung up on the 10,000 row value either. It was just an arbitrary number. Could be 5,000 or 1,000,000. I just need to be able to efficiently chunk the data needed for export.
Vielen Dank!
As punishment for bringing "Real Work(TM)" into The Thread, you must now buy everyone who comes up to you at an event, who utters the phrase "the hippo sent me with the pork chop launcher," a drink, not to exceed $3 USD, from now until 6 May 2016.
:-D:hehe:
Hmmm... that's an interesting choice on dates.
Besides, according to this, I'm entitled to drinks from everyone.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
April 29, 2016 at 11:25 am
Grant Fritchey (4/29/2016)
jasona.work (4/29/2016)
Grant Fritchey (4/29/2016)
Just a follow-up on my other post (and apologies for bringing tech & real work into the Thread, but you guys have the answers so I may as well post the question here).
All the feedback was good and helpful. Now, the remaining question is on the paging. I know of a number of different ways to get this done. However, I suspect you guys have run tests and can make a good recommendation on the best method for breaking apart the data for export (regardless of the precise method of the export). Please don't get hung up on the 10,000 row value either. It was just an arbitrary number. Could be 5,000 or 1,000,000. I just need to be able to efficiently chunk the data needed for export.
Vielen Dank!
As punishment for bringing "Real Work(TM)" into The Thread, you must now buy everyone who comes up to you at an event, who utters the phrase "the hippo sent me with the pork chop launcher," a drink, not to exceed $3 USD, from now until 6 May 2016.
:-D:hehe:
Hmmm... that's an interesting choice on dates.
Besides, according to this, I'm entitled to drinks from everyone.
You may be, but that has little impact on you providing drinks to others.
April 29, 2016 at 11:38 am
GilaMonster (4/29/2016)
rodjkidd (4/29/2016)
SQLBits - make sure to say hi ... It will be good to catch up with the usual mob (Gail, Steve, Grant etc... 😉 )
Errr....
http://sqlinthewild.co.za/index.php/2016/04/26/upcoming-conferences/
errare humanum est. (To err is human.)
😎
Homo non sum. (I am not human.)
April 29, 2016 at 11:47 am
So a coworker attended a gender diversity workshop. She forwarded me the presentation with a slide that says women constitute 40% of database administrators, the highest of any computer science occupation.
Other occupations break down as follows:
Web Developers: 37%
Computer System Analysts: 35%
Computer & Information Systems Managers: 30%
and it keeps going down to Computer Network Architects: 11%
So what does it say about database administration that this job attracts more women than any of the other computer science occupations? Actually, what does it say about DBA culture that gender diversity is so high?
April 29, 2016 at 11:50 am
Brandie Tarvin (4/29/2016)
So a coworker attended a gender diversity workshop. She forwarded me the presentation with a slide that says women constitute 40% of database administrators, the highest of any computer science occupation.
Other occupations break down as follows:
Web Developers: 37%
Computer System Analysts: 35%
Computer & Information Systems Managers: 30%
and it keeps going down to Computer Network Architects: 11%
So what does it say about database administration that this job attracts more women than any of the other computer science occupations? Actually, what does it say about DBA culture that gender diversity is so high?
Good things. Is this published data?
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
April 29, 2016 at 11:53 am
Grant Fritchey (4/29/2016)
jasona.work (4/29/2016)
Grant Fritchey (4/29/2016)
Just a follow-up on my other post (and apologies for bringing tech & real work into the Thread, but you guys have the answers so I may as well post the question here).
All the feedback was good and helpful. Now, the remaining question is on the paging. I know of a number of different ways to get this done. However, I suspect you guys have run tests and can make a good recommendation on the best method for breaking apart the data for export (regardless of the precise method of the export). Please don't get hung up on the 10,000 row value either. It was just an arbitrary number. Could be 5,000 or 1,000,000. I just need to be able to efficiently chunk the data needed for export.
Vielen Dank!
As punishment for bringing "Real Work(TM)" into The Thread, you must now buy everyone who comes up to you at an event, who utters the phrase "the hippo sent me with the pork chop launcher," a drink, not to exceed $3 USD, from now until 6 May 2016.
:-D:hehe:
Hmmm... that's an interesting choice on dates.
Besides, according to this, I'm entitled to drinks from everyone.
Well, I figured for a first-time offense, a light sentence was warranted...
Plus, I have no idea if there even *ARE* any events that you'd be attending in that time frame, and didn't want to stick you with buying drinks for months to come...
But a second offense, well...
We'll just keep the day and month the same, and increase the year...
😀