The fastest way to cut a data center electric bill is to pull the plug. Of course that would have the CEO screaming more than Rex Ryan after another Mark Sanchez completion -- to the other team!
You can't just shut your data center down unless your Chapter 7 filing says you can. But you can move apps from your data center to the cloud, shifting the energy load to the service provider.
These service providers are acutely interested in highly efficient, green data centers. They use all kinds of techniques: efficient cabling, directing cooling only to the gear that actually runs hot, and virtualizing everything to reduce the number of devices that need cooling.
The biggest advantage for cloud providers is not having to locate data centers near or in headquarters and other corporate facilities. Already this decoupling has put data centers in cooler cellars, deep dark caves and cold climates where data center managers simply open the windows (yes, the air passes through a filter, but it is outside air, not A/C) to cool the equipment.
Tate Cantrell, CTO of Verne Global, is keen on Iceland -- Keflavik, Iceland specifically -- as a data center location. And it's not the cool climate so much as how Iceland generates power, Cantrell explains in an interview with Enterprise Systems Journal.
Iceland has a modern grid and is aggressive in pursuing renewable energy such as geothermal and hydro.
An efficient service provider data center located in an energy-efficient country should provide low-cost services and be good for the environment to boot. "Cloud providers have the ability to centralize and innovate in a way that improves efficiency on a scale that an individual company may not have the resources to do on their own. Cloud providers can locate their data centers where power grids are most robust in terms of capacity, where climates cater to free-cooling, and where sources of energy are not solely dependent upon coal," says Cantrell.
Posted by Doug Barney on 10/23/2012 at 12:47 PM
Big data is ideal in some ways for the cloud. Once the data is in the cloud, the processing takes place remotely, and only the results, analysis and updates go up or down the wire. Simply put, big data apps shouldn't be chatty like an end-user productivity app with constant interaction.
But big data apps aren't sold at Best Buy. They usually aren't sold at all, but built. Developer tools, therefore, are paramount.
Precog offers a big data infrastructure platform, complete with tools and APIs, that is now in beta testing.
While aimed at developers, the platform is claimed to need less hard-core development. For instance, Precog "eliminates the need for development teams to learn Hadoop and related complex data storage and analysis technologies, freeing them to focus on core application functionality," the company says.
The company offers a free version as well as a $12,000-a-month platinum edition.
Posted by Doug Barney on 10/16/2012 at 12:47 PM
Hadoop is one crazy name for one crazy technology. Apparently the technology was named after a toy elephant, though it still sounds like it tastes pretty good!
Jack Norris with MapR Technologies, which offers a Hadoop distribution, recently told Enterprise Systems Journal why Hadoop makes so much sense in the cloud.
Before we tackle the whys, what is Hadoop, anyway? According to Norris, "it is a platform that allows enterprises to store and analyze large and growing unstructured data more effectively and economically than ever before. With Hadoop, organizations can process and analyze a diverse set of unstructured data including clickstreams, log files, sensor data, genomic information, and images."
That alone makes it perfect for the cloud. "Hadoop represents a paradigm shift. Instead of moving the data across the network, it's much more effective to combine compute on the data and send the results over the network," Norris explains.
This makes a lot of sense. Processing in the cloud is most often speedy on state-of-the-art servers (the rub being when they are over-virtualized or otherwise overcommitted).
What slows the cloud is the network in and out, so maximizing processing in the cloud and minimizing transmission makes perfect sense.
The cloud is perfect for Hadoop because all the unstructured data Hadoop deals with is growing close to exponentially. "The issue is that these data sources are typically unstructured like social media or sensor data and are growing in volumes that outstrip the ability to process them with the existing tools and processes," Norris says. "Hadoop removes all these obstacles by providing a radically different framework that allows for easy scale-out of systems and for processing power to be distributed. Data from a wide variety of sources can be easily loaded and analyzed with Hadoop. There's no need to go through a lengthy process to transform data and a broad set of analytic techniques can be used."
But this is all too much for the typical data center. The cloud is ideal for handling all this growth.
Fortunately some cloud providers have distinct services for Hadoop.
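If you've never seen the "ship the compute to the data" pattern Norris describes, here's what it looks like at its simplest: a word count written for Hadoop Streaming. This is a generic sketch, not MapR-specific code, and the jar path and input/output locations in the comment are placeholders that vary by distribution. The point is that the scripts run where the data lives, and only the tallies travel back over the network.
#!/usr/bin/env python
# Minimal word count for Hadoop Streaming -- an illustrative sketch, not MapR-specific code.
# A typical (placeholder) invocation; paths vary by distribution:
#   hadoop jar hadoop-streaming.jar -files wordcount.py \
#       -mapper "wordcount.py map" -reducer "wordcount.py reduce" \
#       -input /data/logs -output /data/wordcounts
import sys

def mapper():
    # Runs on the nodes that hold the input splits: emit (word, 1) pairs.
    for line in sys.stdin:
        for word in line.strip().split():
            print("%s\t%d" % (word, 1))

def reducer():
    # Hadoop sorts mapper output by key, so all counts for a word arrive together.
    current, total = None, 0
    for line in sys.stdin:
        word, count = line.rstrip("\n").split("\t")
        if word != current and current is not None:
            print("%s\t%d" % (current, total))
            total = 0
        current = word
        total += int(count)
    if current is not None:
        print("%s\t%d" % (current, total))

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()
Nothing in those few lines cares whether the cluster sits in your basement or in a provider's cloud, which is exactly Norris' point.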
Posted by Doug Barney on 10/16/2012 at 12:47 PM
Security is still bugaboo number one when it comes to the cloud. CipherCloud is hoping to make the cloud less scary with encryption offered through the new CipherCloud Database Gateway.
This tool works with a range of cloud types, everything from IaaS to PaaS and SaaS.
The database-oriented gateway encrypts the data while it is "at rest." It works with your own database instances such as SQL Server and Oracle cloud installs as well as database services such as Oracle Database Cloud Service, Microsoft SQL Azure and Amazon RDS.
With your data fully encrypted, worries about hackers, compliance, data leakage and privacy should all be curtailed.
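I haven't seen inside CipherCloud's gateway, so treat the following as a generic sketch of the at-rest idea rather than their implementation: encrypt a sensitive column before it ever reaches the database and decrypt it on the way back out. The table, column and key handling are made up for illustration, and the sketch leans on the third-party cryptography package.
# Generic column-level encryption at rest -- an illustration, not CipherCloud's gateway.
# Requires the third-party 'cryptography' package (pip install cryptography).
import sqlite3
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in real life the key lives in a key manager, never next to the data
cipher = Fernet(key)

conn = sqlite3.connect(":memory:")     # stand-in for SQL Server, Oracle, Amazon RDS and the like
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, ssn BLOB)")

# Encrypt before the value hits the database, so what sits "at rest" is ciphertext.
conn.execute("INSERT INTO customers (ssn) VALUES (?)", (cipher.encrypt(b"123-45-6789"),))

# Decrypt on the way back out; a stolen database file alone yields nothing readable.
row = conn.execute("SELECT ssn FROM customers WHERE id = 1").fetchone()
print(cipher.decrypt(row[0]).decode())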
Posted by Doug Barney on 10/11/2012 at 12:47 PM
Packaged software is pretty straightforward. You buy it for a set price, use it, and some time later either buy another or an upgrade. Even with volume licenses, as complex as they can be, you are still essentially just getting a discount with full rights to the package.
Boy, oh boy, is the cloud different. Here you buy access. Sometimes you are buying access to what used to be a packaged app and this is usually fairly simple. Other times you are buying or renting access to services or capacity.
That is the case with Windows Azure, which can be offered as infrastructure or platform as a service. Here you can pay for what you agree to use, rather than the existing and still available pay-as-you-go model.
If you agree to a certain amount of use, you get one of three discount levels: 20, 23 or 27 percent. The lowest discount is for monthly commitments of $500 to $14,999 (they couldn't round up?), while the biggest price cut is for commitments of $40,000 and above.
Go beyond what you committed to and pay-as-you-go pricing kicks back in.
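To get a rough feel for how those tiers play out, here's a back-of-the-envelope sketch. Only the bottom ($500 to $14,999) and top ($40,000 and up) cutoffs are spelled out above, so the $15,000-to-$39,999 boundary for the 23 percent tier is my assumption, and real Azure billing has plenty of wrinkles this toy ignores.
# Back-of-the-envelope Azure commitment math -- a toy model, not Microsoft's billing engine.
def discount_rate(monthly_commitment):
    if monthly_commitment >= 40000:
        return 0.27
    if monthly_commitment >= 15000:   # assumed boundary for the middle (23 percent) tier
        return 0.23
    if monthly_commitment >= 500:
        return 0.20
    return 0.0                        # below $500 there is no commitment plan

def monthly_bill(commitment, payg_value_used):
    """commitment: dollars of usage committed to per month.
    payg_value_used: what the month's actual usage would cost at pay-as-you-go rates."""
    covered = min(payg_value_used, commitment)        # usage inside the commitment gets the discount
    overage = max(payg_value_used - commitment, 0.0)  # beyond it, pay-as-you-go kicks back in
    return covered * (1 - discount_rate(commitment)) + overage

# Commit to $20,000 of monthly usage but consume $25,000 worth:
print(monthly_bill(20000, 25000))   # 20000 * 0.77 + 5000 = 20400.0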
If you are confused, avail yourself of the Windows Azure Pricing Calculator available here.
Posted by Doug Barney on 10/11/2012 at 12:47 PM
Folks building private clouds often go with what they know, and when it comes to virtual infrastructure what they know is usually VMware. And when it comes to VMware private clouds, vSphere is the way to go.
But vSphere and VMware pricing until recently didn't live up to the sterling reputations of the company's products. Product licensing was based on a rather bizarre approach that charged based on how much memory the tool used. Weird and complicated? Yes. Costly? For sure.
At the recent VMworld in Las Vegas, it wasn't a new product or a nice zinger that drew applause during a company keynote. It was the end of memory-based pricing and a return to CPU-based pricing.
For an analysis as to why RAM-based pricing was so bad, check out this blog.
Posted by Doug Barney on 10/02/2012 at 12:47 PM
Third parties large and small are moving on-premises wares to the cloud. One advantage is that this existing software is presumably stable and debugged. Another is the ability to have a hybrid install, where some software and data stay in the shop while the rest is out in the ether.
This is what Symantec has done with Enterprise Vault.cloud.
The tool was originally just Symantec Enterprise Vault, and did (and still does) e-mail archiving. The cloud version takes all the existing features and adds access from anywhere. Your mail can be read from a browser, smart phone or your trusty PC.
For IT, cloud storage means you can stop enforcing mailbox quotas, an unwelcome task that turns IT into the bad guy.
We spoke with Amy Dugdale, Symantec's senior manager of product marketing for Enterprise Vault.cloud, to learn more about this new approach, including the benefits of the hybrid model and the ability to choose between in-house and the cloud.
"If they are looking for a predictable cost structure with per user/per month pricing, unlimited storage and retention and automatic maintenance/upgrades, then Enterprise Vault.cloud is a great way to go," says Dugdale. "If they already have full-time IT staff that can manage the archive on-premise then they may prefer to use Enterprise Vault."
Posted by Doug Barney on 10/02/2012 at 12:47 PM
There are myriad Web-based storage tools, and the free ones really ain't that bad. The paid ones are even better.
For pure backup I use Carbonite, and the $55 a year is little to pay for peace of mind. I guess I'd like to see the service double as a cloud check-in and checkout service for files when I'm on a different machine. My guess is they are working on that somewhere, just waiting for the technology and business needs to crash into each other.
For cloud file check in and checkout I use DropBox and it is simple and straightforward.
For those that need more than simple and straightforward, there's a new tool, Oxygen Cloud, which combines on-premises and cloud storage in some intriguing ways.
First, Oxygen Cloud is built on a virtual file system, and that file system, as I understand it, is the foundation for how the storage is organized on-premises. The idea is that the file system has the intelligence to control how files are distributed to endpoints and accessed over the Internet -- endpoints that include all the major clients: Windows, Mac, Android and vanilla HTTPS.
IT can manage access, allowing a full dump to authorized users, which cuts down on file access over the network, or more controlled access for less trusted users.
And all this is secured with no less than three instances of encryption.
Users shouldn't see all this plumbing. All they should see is another local drive.
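Oxygen Cloud hasn't shown me its internals, so here is nothing more than a toy sketch of the access model described above: trusted users get everything pushed down to the local drive, less trusted ones fetch files on demand over HTTPS, and either way the plumbing stays hidden. Every name in it is invented.
# Toy sketch of a per-user file distribution policy -- not Oxygen Cloud's actual API.
TRUSTED_USERS = {"alice", "bob"}    # invented list of fully authorized users

def delivery_plan(user, files):
    """Decide how each file reaches this user's virtual drive."""
    if user in TRUSTED_USERS:
        # Full local copy: cuts down on repeat file access over the network.
        return {f: "sync-local-copy" for f in files}
    # Controlled access: nothing cached locally, every open goes out over HTTPS.
    return {f: "fetch-on-demand" for f in files}

print(delivery_plan("alice", ["q3-forecast.xlsx", "logo.png"]))
print(delivery_plan("mallory", ["q3-forecast.xlsx", "logo.png"]))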
Do you use a cloud storage or file service, and if so, which one? Tell me more at [email protected].
Posted by Doug Barney on 09/26/2012 at 12:47 PM
VMware used to be a hypervisor company. Then it became a company with a hypervisor surrounded by a bunch of management tools. Then there were more and more management tools, along with storage.
Then the third parties really kicked in, building apps that tie into richer versions of the hypervisor. Before long VMware was a full-blown platform company.
In fact, many customers buy a range of related VMware tools that transform large swaths of internal datacenters.
One key product for building private clouds the VMware way is vCloud Director (vCD). Virtualization Review contributor David Davis knows a thing or two about vCD and was kind enough to share that wisdom in a recent piece.
First, vCD is software that can turn an existing vSphere system into a private cloud by adding multitenancy and self-service.
From the IT view, it enables the creation of groups of virtual machines for specific workloads, all with security and network configuration applied.
Remember when I said VMware was a platform company? If you don't get it now, you'll understand when you try to set up vCD, as you need to have a VMware platform in place.
Here's what you must have: At least two ESXi hosts running under a vSphere setup, as well as vCenter and vShield.
If you just want to play around, you can download a vCD virtual appliance, which is free for two months.
Have you built a VMware-based private cloud or are you thinking of one? Share your thoughts at [email protected].
Posted by Doug Barney on 09/26/2012 at 12:47 PM
Thinking about putting Exchange in the cloud? Nah. You might lose all your mail. SQL Server? Too mission critical, and too much transactional data going back and forth -- your performance would stink worse than a week-old striper.
SharePoint -- now there's the sweet spot. Not exactly mission critical, no real transactions to bring the T-1 to its knees. And so many of these apps are tactical. These puppies have to go up fast and simple. The last thing IT needs is to get bogged down building a bunch of SharePoint servers just for a small team on a short-term project. With the cloud, IT can ignore the whole thing and let the small team with the short-term project do it all themselves.
To me that makes SharePoint one of the first Microsoft apps I'd move to the cloud.
Posted by Doug Barney on 09/18/2012 at 12:47 PM
When I think of big data I don't really think of the cloud. First I think of big servers and big processors. Then I think of in-memory where this data is in RAM for lightning fast analysis. This sure ain't the cloud.
The other problem is that big data doesn't fit well over the clogged pipes that lead to and from the cloud. However, if your big data is somehow already all up in the cloud and you are just sending instructions to manipulate it, then you don't have to buy the big servers with the big processors. There, the cloud might just make sense.
Rand Wacker from CloudPassage knows more about this than do I and took a whack at these issues in a recent interview with Enterprise Systems Journal.
Besides putting your entire big data app in the cloud, and there are already big data SaaS apps that can do the job, Wacker has an interesting hybrid alternative. "Instead of provisioning your computing infrastructure for peak load -- and then some -- you can cover the base load with your own private cloud and either lease the spikes or burst out into the public cloud when additional temporary resources are needed, as it is often the case with big data projects," he explains.
When it comes to having all your big data in the cloud, Wacker is also a fan. "The on-demand nature of the cloud enables companies to very easily secure the amount of processing power they need, no matter how large or how small. Instead of building a data center dedicated to data analysis, companies can lease servers by the hour. For highly variable workloads, metered (or utility) billing matches costs to usage and can lower overall investment, especially for firms that are just beginning to leverage big data technology."
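To make Wacker's burst-out idea concrete, here's a toy scheduler: the private cloud covers the base load, and any demand above that capacity spills over to leased public-cloud capacity. The capacity figure and job sizes are invented.
# Toy sketch of cloud bursting -- base load stays private, spikes spill to the public cloud.
PRIVATE_CAPACITY = 100   # invented figure: units of compute the in-house cloud handles at once

def place_jobs(job_demands):
    """Fill private capacity first; burst whatever doesn't fit."""
    placements, used = [], 0
    for name, demand in job_demands:
        in_private = max(min(demand, PRIVATE_CAPACITY - used), 0)
        used += in_private
        placements.append((name, in_private, demand - in_private))
    return placements

# A routine job fits in-house; a big analytics run bursts out for its spike.
for name, private, burst in place_jobs([("nightly-etl", 60), ("adhoc-analytics", 90)]):
    print("%s: %d units private, %d units burst to the public cloud" % (name, private, burst))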
Does big data belong in the cloud or will we just overload the thing and slow down ESPN.com?
You tell me at [email protected].
Posted by Doug Barney on 09/18/2012 at 12:47 PM
Just when I was starting to figure out SaaS vs. IaaS vs. PaaS, now comes private vs. public PaaS. Fortunately there is Bart Copeland, CEO of ActiveState, to make sense of it all.
First, platform as a service. You probably know what it means, but it's a fairly new term so there's nothing wrong with making sure we all know what we're talking about.
I think of the cloud as a stack. At the lowest level is infrastructure as a service (IaaS): really pure infrastructure upon which I can build my apps, kind of like a bare-metal server.
Platform is infrastructure equipped with the components needed to run things, taking that bare-metal server and adding an OS, storage, networking interfaces, that sort of thing.
At a higher level is SaaS. That's pretty self-explanatory.
Getting back to PaaS, a public offering would support my apps and data and let me offer services. But it might also be hackable. A private PaaS should be safer, but it should also be faster and have more uptime because presumably it wouldn't be shared by every Tom, Dick and Harry. That's because a pure private PaaS would be in your own shop. Another approach, though semantically less pure, is a private PaaS carved out of a provider's network that is yours and yours alone.
You might have thought it was dumb that I defined all those terms up front, but we are early in this game. In the best of cases these names get misused; in the worst, they are thoroughly abused.
Posted by Doug Barney on 09/11/2012 at 12:47 PM