SQLServerCentral is supported by Red Gate Software Ltd.
 
 
 
 
        



No Limits
Posted Wednesday, August 1, 2012 9:42 PM


SSC-Dedicated


Group: Administrators
Last Login: Yesterday @ 9:02 PM
Points: 33,153, Visits: 15,284
Comments posted to this topic are about the item No Limits






Follow me on Twitter: @way0utwest

Forum Etiquette: How to post data/code on a forum to get the best help
Post #1338935
Posted Thursday, August 2, 2012 2:26 AM


SSCommitted


Group: General Forum Members
Last Login: Yesterday @ 3:04 AM
Points: 1,654, Visits: 1,085
I'll cheerfully admit I wouldn't even have a clue how to find out what problems might require $2m/day worth of computing, let alone how to submit such problems to the machine's mighty maw. I too would love to hear from anyone who does.

In the meantime, this article, linked at the bottom of one of Steve's links, gives some food for thought (SFW, I would think, unless you are at a religious institution or something) about the problems faced in a 'sister' industry of our own. Redis sounds interesting.
Post #1339005
Posted Thursday, August 2, 2012 3:57 AM
Valued Member


Group: General Forum Members
Last Login: Yesterday @ 3:23 PM
Points: 67, Visits: 421
As a DBA, the first thing that pops into my mind is large-scale BI processing and data mining. That puts quite some pressure on the challenge of moving large amounts of data around both quickly and securely. It also requires the provider to sell this kind of processing power for an hour a day or so. But in many cases, in-house BI processing requires additional powerful servers that sit nearly idle most of the day.

If a giant like Google can provide data processing to clients on a global scale, it would make better use of the required resources, because at any time, somewhere, someone will need to process some data. After a huge initial transfer of data, only the deltas are needed, so the amount of data moved could be quite manageable. Maybe in the future we will receive our daily cubes from Google ...
Post #1339040
Posted Thursday, August 2, 2012 6:39 AM
SSC-Enthusiastic


Group: General Forum Members
Last Login: Friday, November 1, 2013 1:47 PM
Points: 167, Visits: 164
I think Microsoft & Google have partnered up to use the Google engine to compute how many times Steve has mentioned cloud computing, to dispute his cloud sponsorship payments. I just wonder who the next Ross Perot will be, with cloud systems like this seemingly having vast amounts of non-use/down time. Maybe Google can use it to search for Bigfoot, find out what makes crop circles, or find the gene for male pattern baldness (I threw that in for Steve).
Post #1339098
Posted Thursday, August 2, 2012 7:06 AM
SSC Veteran


Group: General Forum Members
Last Login: Friday, August 15, 2014 8:05 AM
Points: 212, Visits: 314
We've seen RFPs asking for 2 terabytes of data in the cloud with a 24-hour recovery time and the data encrypted at rest. I can't find any cloud provider who can offer SQL Server Enterprise Edition (needed for TDE and partitioning) at that scale (which to me means that replication over to a failover node, and the data itself, are handled by them, or so I hope). Just investigating this; any suggestions?

Your point about data quality is very important. Having quality data, or at least understanding the limits of your data, is critical in any data-mining exercise. When the inputs are incomplete or put together incorrectly, you can arrive at hopelessly wrong conclusions, or none at all. I hope that companies recognize that data governance is worth putting more resources into.
Post #1339123
Posted Thursday, August 2, 2012 7:19 AM
SSC Veteran


Group: General Forum Members
Last Login: Thursday, June 26, 2014 7:28 AM
Points: 257, Visits: 902
Steve, if anyone does share with you their big cloud operations please beg for permission to share the gist of it with us.

One application that comes to mind for $2m/day computing costs is pharma research. If 770,000 cores can do in one day what on-site resources would take a month to do, then the time saved is worth the money. I would definitely like to hear about the type (and volume) of data, as well as how and why it makes sense to outsource computation.
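The time-for-money tradeoff described above can be sketched as a simple comparison. All dollar figures here are hypothetical placeholders for illustration; the post only gives the $2m/day rental figure and the one-day-versus-one-month speedup:

```python
def burst_worth_it(cloud_cost_per_day, days_saved, value_per_day_saved):
    """Return True if paying for a one-day cloud burst beats waiting.

    cloud_cost_per_day  -- cost of renting the burst capacity for a day
    days_saved          -- days of waiting eliminated (a 30-day job done in 1 saves 29)
    value_per_day_saved -- what one day of earlier results is worth to the business
    """
    return days_saved * value_per_day_saved > cloud_cost_per_day

# Hypothetical pharma example: $2M/day rental, 29 days saved,
# each day of earlier results assumed worth $100k.
print(burst_worth_it(2_000_000, 29, 100_000))  # → True
```

The whole argument hinges on `value_per_day_saved`, which is exactly the number outsiders never get to see; that is presumably why these workloads are worth $2m/day to some companies and nothing to others.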
Post #1339143
Posted Thursday, August 2, 2012 8:53 AM


SSC-Dedicated


Group: Administrators
Last Login: Yesterday @ 9:02 PM
Points: 33,153, Visits: 15,284
thadeushuck (8/2/2012)
I think Microsoft & Google have partnered up to use the Google engine to compute how many times Steve has mentioned cloud computing, to dispute his cloud sponsorship payments. I just wonder who the next Ross Perot will be, with cloud systems like this seemingly having vast amounts of non-use/down time. Maybe Google can use it to search for Bigfoot, find out what makes crop circles, or find the gene for male pattern baldness (I threw that in for Steve).


LOL, I wish I was getting paid for sponsorships.

Cloud computing is an interesting topic, and it's different from most things we've seen before. Sometimes I can't decide whether a given implementation is good or bad, and for the most part I think we're stuck doing a case-by-case analysis of where and when it works. So that means learning more about it.

I'd be happy to shave everything off the top of my head. Just waiting for my wife to give me the OK.







Post #1339242
Posted Thursday, August 2, 2012 8:54 AM


SSC-Dedicated


Group: Administrators
Last Login: Yesterday @ 9:02 PM
Points: 33,153, Visits: 15,284
zintp (8/2/2012)
We've seen RFPs asking for 2 terabytes of data in the cloud with a 24-hour recovery time and the data encrypted at rest. I can't find any cloud provider who can offer SQL Server Enterprise Edition (needed for TDE and partitioning) at that scale (which to me means that replication over to a failover node, and the data itself, are handled by them, or so I hope). Just investigating this; any suggestions?


No idea so far, but I'll look. I suspect only AWS/Azure/Google could do this right now; most of the other offerings are less fleshed out.

One thought is that most of these companies do offer plain VMs, so you can install SQL Server EE yourself and use TDE. However, if you're doing that, you're essentially doing co-location, just letting someone else buy the hardware. I'm not sure what the point is there.







Post #1339246
Posted Thursday, August 2, 2012 8:56 AM


SSCoach


Group: General Forum Members
Last Login: Friday, June 27, 2014 12:43 PM
Points: 15,444, Visits: 9,596
I recently completed a project that required cleaning up over 100 million rows of name-and-address type data. It was a mess, and the clean-up was very, very resource-hungry. But I won't need those hardware resources again for any foreseeable project here. It would have been great to rent some clock cycles and so on from Amazon or whomever for the duration of the project. There was no budget for it, but we could plan that kind of contingency into future budgets.

So I think this is a great trend with all kinds of potential uses.


- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread

"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
Post #1339249
Posted Thursday, August 2, 2012 9:15 AM


SSC-Dedicated


Group: Administrators
Last Login: Yesterday @ 9:02 PM
Points: 33,153, Visits: 15,284
Mike Dougherty-384281 (8/2/2012)
Steve, if anyone does share with you their big cloud operations please beg for permission to share the gist of it with us.

One application that comes to mind for $2m/day computing costs is pharma research. If 770,000 cores can do in 1 day what on-site resources would do in a month, then the time saved is worth the money tradeoff. I would definitely like to hear about the type (and volume) of data as well as how/why it makes sense to outsource computation.


I'll definitely try to share whatever I can learn. Some of the places I linked are examples of what I've seen. The big win seems to be the lack of up-front investment needed for large-scale computing. There's definitely a tipping point here, and I've seen this in the *nix world before, with large IBM machines that had extra hardware installed that wasn't licensed to us. We could activate it for short periods as needed, paying a "rental" fee.

As an example, around 2001/2002 we had a large 64-CPU AIX server, but we were licensed for 36 CPUs; that's what we'd "bought". At end of quarter, we could "rent" an additional 10-12 CPUs for 2-3 days with a license key. AIX allowed hot-adding CPUs, so this worked well for us. Our calculations showed that renting was worthwhile until we needed roughly 90+ days a year. Since we were looking at 8-10 days a year, it was better to rent the CPUs than buy them.

I think that's what cloud computing gets you when it's done well: you can burst in the places you need to. If the load is steady, you probably do better purchasing equipment at some point.
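The rent-versus-buy reasoning above boils down to a break-even calculation on days of use per year. The post gives the day counts (roughly 90 days break-even, 8-10 days of actual need) but not the prices, so the dollar figures below are invented purely to reproduce that threshold:

```python
def break_even_days(purchase_cost_per_cpu, rental_cost_per_cpu_day):
    """Days of use per year at which buying a CPU outright beats renting it.

    Below this threshold, renting (capacity-on-demand, or cloud burst)
    is cheaper; above it, a permanent purchase wins.
    """
    return purchase_cost_per_cpu / rental_cost_per_cpu_day

# Hypothetical figures: $45,000 to buy a CPU outright,
# $500/day to rent one under a capacity-on-demand key.
threshold = break_even_days(45_000, 500)
print(threshold)        # 90.0 days/year, matching the ~90-day figure above
days_needed = 10        # days/year actually needed (8-10 in the post)
print("rent" if days_needed < threshold else "buy")  # → rent
```

A real comparison would also fold in depreciation, power, and support costs on the purchase side, but the shape of the decision (steady load favors buying, bursty load favors renting) stays the same.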







Post #1339269