How do you know if your shop is ready for the cloud and who decides? Cloud answers are based on technical analysis with more than a fair share of philosophy and politics.
On the technical side, Microsoft in October released its Cloud Readiness Tool.
The tool is an interactive survey of 27 questions, which the company says takes about 10 to 15 minutes to complete. The goal is to gauge how mature your shop is: the more mature, the more ready you are for the cloud. Here, mature means your processes are well thought out and well built and, despite the word, that you generally use modern tools.
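To make the idea of a maturity score concrete, here is a hypothetical sketch of how such a survey might be scored. The answer scale, weights and question count are illustrative assumptions, not details of Microsoft's actual tool.

```python
# Hypothetical sketch of readiness-survey scoring. The answer scale and
# scoring are illustrative only, not Microsoft's actual methodology.

ANSWER_SCALE = {"no": 0, "partially": 1, "yes": 2}

def readiness_score(answers):
    """Average the per-question scores into a 0-100 maturity percentage."""
    total = sum(ANSWER_SCALE[a] for a in answers)
    return round(100 * total / (2 * len(answers)))

# Example: 27 questions, mostly answered "yes"
answers = ["yes"] * 20 + ["partially"] * 5 + ["no"] * 2
print(readiness_score(answers))  # 83
```

The real tool maps answers to topic-area findings rather than a single number, but the arithmetic of "more mature answers, higher readiness" is the same idea.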
Based on your answers the tool will spit out its findings in a custom report that addresses "security, privacy and reliability topic areas: security policies capabilities, personnel capabilities, physical security capabilities, privacy capabilities, asset and risk management capabilities, and reliability capabilities," Microsoft says.
Microsoft is also interested in what industry you are part of and points you to related organizations and standards, such as HIPAA for health care and the Payment Card Industry (PCI) Data Security Standard (DSS).
Posted by Doug Barney on 11/06/2012 at 12:47 PM
Last year I had an important cloud question: How much delay does the cloud introduce, and therefore how much slower are cloud apps and services than their on-premises brethren? I hit up all the experts, including vendors, and got many well-researched, well-thought-out answers. Unfortunately, each was different.
Some believed there was no significant latency in cloud apps, a logically vacant conclusion. Others argued that all the network hops and often-slow enterprise egresses made the cloud virtually unusable. My determination: it's somewhere in between.
I also spent a fair bit of time discussing how your WAN should be optimized for cloud access and what to look for in cloud apps and services such that your end users won't wring your neck after installation.
David Linthicum tackled a similar subject in a recent Enterprise Systems Journal piece.
Linthicum looked at the public cloud, otherwise known as the Internet, and saw it wanting. While the internal processing on service provider server farms is usually snappy, the public cloud is less so. And when you have a chatty app, which needs to update user screens and so forth, the whole system bogs down.
Apps that don't require all that chatter, however, can work quite well on the public cloud.
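The chatty-versus-batched distinction comes down to simple arithmetic: every round trip pays the Internet's latency toll. The numbers below are illustrative assumptions, not measurements.

```python
# Back-of-the-envelope sketch of why chatty apps suffer in the public cloud.
# The RTT and request counts are assumed values for illustration.

def total_network_time(round_trips, rtt_ms):
    """Total time spent just waiting on the wire, in milliseconds."""
    return round_trips * rtt_ms

RTT_MS = 80  # assumed Internet round-trip time to the provider

chatty = total_network_time(round_trips=200, rtt_ms=RTT_MS)  # screen-by-screen updates
batched = total_network_time(round_trips=4, rtt_ms=RTT_MS)   # one bulk request per task

print(chatty, batched)  # 16000 vs. 320 ms of pure wire-waiting
```

The server-farm processing time is the same in both cases; the chatty app simply pays the 80 ms toll 50 times more often, which is why it "bogs down."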
A logical alternative, Linthicum believes, is building your own cloud. By having an on-premises cloud, you control the speed because it is based on your servers and runs on the network you designed.
Posted by Doug Barney on 11/06/2012 at 12:47 PM
Every cloud survey I've seen in eons points to security as the biggest concern in moving to the cloud. And on the surface that makes sense. Your data is no longer in your shop but elsewhere in a provider's network. More telling, the data also has to move from the provider over the Internet to wherever it's going.
Logically, all this is fraught with danger. But I am not one who is terribly terrified. I don't trust every internal IT person, so the data is not necessarily safer in one's own data center. And security within a data center where that is the vendor's only business should be better, with more modern tools and up-to-date security experts hired as guards.
Derek Tumulak is a V.P. at Vormetric, which does key management and encryption. Tumulak has some advice on keeping your data safe and your CIO calm as you make your cloud move.
One approach, which also boosts performance, is a private cloud. But here again, the issue is whether you can realistically do a better job at securing what is now a highly virtualized data center than a dedicated, well-heeled service provider.
On the flip side most clouds are multi-tenant, so it is not just the service provider that is near your data but other tenants.
So how do you secure your cloud? First you need a plan, Tumulak advises -- in this case, a three-year plan that determines which systems are most critical and what cloud moves you intend to make.
One part that is music to my ears is Tumulak's focus on education. Many breaches are due to human error after all.
Another needed plan concerns response. How do you tell customers that may be impacted, and how do you find the root cause?
Posted by Doug Barney on 10/30/2012 at 12:47 PM
With hackers in abundance, the loss of physical data ownership that the cloud entails can be high risk. That is plenty of exposure. And moving from cloud to cloud, such as from private to public and perhaps to hybrid (which itself entails toggling between private and public), adds a whole new element of danger. Andrew Hay, CloudPassage's chief evangelist, tackles the issue in a conversation with Enterprise Systems Journal.
In Hay's view, many shops transform their data centers into private clouds, which is really done by taking virtualization to the peak of what it can do through management, orchestration and availability. From there, many begin to eye public clouds, migrating what they have accomplished on premises to a service provider.
A true private cloud should be relatively easy, technically, to move this way. And that move lets you escape large capital expenses and shift to a leasing model. The best part: without CAPEX and the need to personally build out infrastructure, there is less of a barrier to launching new systems.
While the technical work can be straightforward, additional work must be done to account for the cloud, such as really making sure it's all secure. "Public cloud presents several nuances that directly impact the way traditional security tools operate. Traditional security tools were created at a time when cloud infrastructures did not exist. Multi-tenant -- and even some single-tenant -- cloud-hosting environments introduce many nuances, such as dynamic IP addressing of servers, cloud bursting, rapid deployment, and equally rapid server decommissioning, which the vast majority of security tools cannot handle," Hay explains.
Hay's point is that existing security isn't enough for this new world. For example, perimeter is not enough and must be buttressed with end-point tools. But some cloud providers decide what tools to offer, and you either have to live with them or find another host.
And if you go the hybrid route, it can be hard to find tools that can protect your apps and data as they toggle from private to public. And assuming that what you have can do the job just because it cost a lot can be a big mistake.
Firewalls, for instance, aren't always cloud ready. "Network address assignment is far more dynamic in clouds, especially in public clouds. There is rarely a guarantee that your server will spin up with the same IP address every time. Current host-based firewalls can usually handle changes of this nature, but what about firewall policies defined with specific source and destination IP addresses?" Hay asks. "How will you accurately keep track of cloud server assets or administer network access controls when IP addresses can change to an arbitrary address within a massive IP address space? Also, with hybrid cloud environments, the cloud instance can move to a completely different environment -- even ending up on the other side of the firewall configured to protect it."
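One common answer to the problem Hay describes is to key access rules to a server's role or tag rather than its IP address, so the policy survives address churn. The sketch below is a conceptual illustration; the rule names and roles are invented for the example.

```python
# Sketch of Hay's point: firewall policies keyed to specific IP addresses
# break when cloud instances re-spin with new addresses. Keying rules to
# roles (tags) instead survives the churn. All names here are illustrative.

def allowed(rules, src_role, dst_role, port):
    """Check a connection against role-based (not IP-based) rules."""
    return (src_role, dst_role, port) in rules

rules = {
    ("web", "db", 5432),     # web tier may reach the database
    ("web", "cache", 6379),  # web tier may reach the cache
}

# The web tier keeps database access no matter what IP it spins up with.
print(allowed(rules, "web", "db", 5432))  # True
print(allowed(rules, "web", "db", 22))    # False: SSH not in policy
```

An IP-based rule set would need rewriting every time an instance moved; a role-based one only needs the instance correctly tagged at launch.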
That's a lot to chew on, but if you want the benefits of cloud, a little homework is a small price to pay.
Posted by Doug Barney on 10/30/2012 at 12:47 PM
The cloud, as vendors tell us, can do nearly everything except wash the dishes. One new area is serving up virtual desktops/PCs over the cloud as opposed to the corporate network. The beauty here is that you can get at your virtual PC from nearly anywhere. My fear is that when you don't have solid connections across all the hops into wherever you are, your PC runs like you have Windows 1.0 on an 80286.
Pricing and flexibility may make up for the performance ups and downs. And according to Atlantis Computing, the savings are dramatic.
Its customer, Colt Technology, set up a cloud-based VDI system equipped to handle 20,000 desktops. Because it is cloud based, you don't need your own massive data center and all the gear that entails.
To speed performance, the system uses server RAM for storage, a technique pioneered by the in-memory crowd. Despite my fears of sluggishness, Atlantis claims that most common operations blaze by with a 0.53 second average response time.
The cost is perhaps the key benefit. The companies claim that operating expense fell by 23 percent and capital expense by 61 percent.
Posted by Doug Barney on 10/23/2012 at 12:47 PM
The fastest way to cut a data center electric bill is to pull the plug. Of course that would have the CEO screaming more than Rex Ryan after another Mark Sanchez completion -- to the other team!
You can't just shut your data center down unless your Chapter 7 says you can. But you can move apps from your data center to the cloud, shifting the energy load to the service provider.
These service providers are acutely interested in highly efficient, green data centers. There are all kinds of techniques they use: efficient cabling, directing cooling to only that which is hot, and virtualizing everything to reduce the number of devices that need cooling.
The biggest advantage to cloud providers is not having to locate data centers near or in headquarters and other corporate facilities. Already this decoupling has put data centers in cooler cellars, deep dark caves and cold climates where data center managers simply open the windows (yes, the air passes through a filter but it is outside air, not A/C) to cool the equipment.
Tate Cantrell, CTO of Verne Global, is keen on Iceland (Keflavik, specifically) for data center location. And it's not the cool climate so much as how Iceland generates power, Cantrell explains in an interview with Enterprise Systems Journal.
Iceland has a modern grid and is aggressive in pursuing renewable energy such as geothermal and hydro.
An efficient service provider data center located in an energy-efficient country should provide low-cost services and be good for the environment to boot. "Cloud providers have the ability to centralize and innovate in a way that improves efficiency on a scale that an individual company may not have the resources to do on their own. Cloud providers can locate their data centers where power grids are most robust in terms of capacity, where climates cater to free-cooling, and where sources of energy are not solely dependent upon coal," says Cantrell.
Posted by Doug Barney on 10/23/2012 at 12:47 PM
Big data is in some ways ideal for the cloud. Once the data is in the cloud, the processing takes place remotely; the results, analysis and updates are all that go up or down the wire. Simply put, big data apps shouldn't be chatty the way an end-user productivity app with constant interaction is.
But big data apps aren't sold at Best Buy. They usually aren't sold at all, but built. Developer tools, therefore, are paramount.
Precog offers an infrastructure platform, with tools and APIs for big data, that is now in beta test.
While aimed at developers, the platform is claimed to need less hard-core development. For instance, Precog "eliminates the need for development teams to learn Hadoop and related complex data storage and analysis technologies, freeing them to focus on core application functionality," the company says.
The company offers a free version as well as a $12,000-a-month platinum edition.
Posted by Doug Barney on 10/16/2012 at 12:47 PM
Hadoop is one crazy name for one crazy technology. Apparently the technology was named after a toy elephant, though it still sounds like it tastes pretty good!
Jack Norris with MapR Technologies, which offers a Hadoop distribution, recently told Enterprise Systems Journal why Hadoop makes so much sense in the cloud.
Before we tackle the whys, what is Hadoop, anyway? According to Norris, "it is a platform that allows enterprises to store and analyze large and growing unstructured data more effectively and economically than ever before. With Hadoop, organizations can process and analyze a diverse set of unstructured data including clickstreams, log files, sensor data, genomic information, and images."
That alone makes it perfect for the cloud. "Hadoop represents a paradigm shift. Instead of moving the data across the network, it's much more effective to combine compute on the data and send the results over the network," Norris explains.
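The shift Norris describes can be shown with a toy example: compute a small summary where each data partition lives, then ship only the summaries over the network instead of the raw data. This is the map/reduce idea in miniature, not Hadoop itself.

```python
# Toy illustration of moving compute to the data: summarize each partition
# locally, then ship only the tiny summaries, not the raw records.

from collections import Counter

partitions = [  # imagine each list living on a different storage node
    ["error", "ok", "error"],
    ["ok", "ok", "error"],
]

# "Map" locally on each node: a small per-partition summary...
local_counts = [Counter(p) for p in partitions]

# ...then "reduce" by combining only the summaries over the network.
total = sum(local_counts, Counter())
print(total["error"], total["ok"])  # 3 3
```

With terabytes of logs per node, the summaries are a few bytes each while the raw data never crosses the wire, which is exactly why the approach suits cloud networks.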
This makes a lot of sense. Processing in the cloud is most often speedy on state-of-the-art servers (the rub being when they are over-virtualized or otherwise overcommitted).
What slows the cloud is the network in and out, so maximizing processing in the cloud and minimizing transmission makes perfect sense.
The cloud is perfect for Hadoop because all the unstructured data Hadoop deals with is growing close to exponentially. "The issue is that these data sources are typically unstructured like social media or sensor data and are growing in volumes that outstrip the ability to process them with the existing tools and processes," Norris says. "Hadoop removes all these obstacles by providing a radically different framework that allows for easy scale-out of systems and for processing power to be distributed. Data from a wide variety of sources can be easily loaded and analyzed with Hadoop. There's no need to go through a lengthy process to transform data and a broad set of analytic techniques can be used."
But this is all too much for the typical data center. The cloud is ideal for handling all this growth.
Fortunately some cloud providers have distinct services for Hadoop.
Posted by Doug Barney on 10/16/2012 at 12:47 PM
Security is still bugaboo number one when it comes to the cloud. CipherCloud is hoping to make the cloud less scary with encryption offered through the new CipherCloud Database Gateway.
This tool works with a range of cloud types, everything from IaaS to PaaS and SaaS.
The database-oriented gateway encrypts the data while it is "at rest." It works with your own database instances such as SQL Server and Oracle cloud installs as well as database services such as Oracle Database Cloud Service, Microsoft SQL Azure and Amazon RDS.
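Conceptually, a gateway like this sits between the application and the database, encrypting a value on the way in and decrypting it on the way out, so what lands "at rest" is unreadable. The sketch below illustrates only that round-trip shape; the XOR "cipher" is a deliberately trivial stand-in, not what CipherCloud or any real gateway uses (real products use AES or similar).

```python
# Toy sketch of the encrypt-on-write / decrypt-on-read pattern a database
# gateway provides. The XOR "cipher" is a stand-in for illustration only --
# it is NOT real cryptography and must never be used to protect data.

import base64

KEY = b"demo-key"  # illustrative key, not a real key-management scheme

def toy_encrypt(plaintext: str) -> str:
    data = plaintext.encode()
    xored = bytes(b ^ KEY[i % len(KEY)] for i, b in enumerate(data))
    return base64.b64encode(xored).decode()

def toy_decrypt(ciphertext: str) -> str:
    xored = base64.b64decode(ciphertext)
    data = bytes(b ^ KEY[i % len(KEY)] for i, b in enumerate(xored))
    return data.decode()

stored = toy_encrypt("4111-1111-1111-1111")  # what lands "at rest"
print(stored != "4111-1111-1111-1111")       # True: unreadable on disk
print(toy_decrypt(stored))                   # round-trips back for the app
```

The point of the gateway model is that the application and its queries are unchanged; only the stored bytes differ.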
With your data fully encrypted, worries about hackers, compliance, data leakage and privacy should all be curtailed.
Posted by Doug Barney on 10/11/2012 at 12:47 PM
Packaged software is pretty straightforward. You buy it for a set price, use it, and some time later either buy another or an upgrade. Even with volume licenses, as complex as they can be, you are still essentially just getting a discount with full rights to the package.
Boy, oh boy, is the cloud different. Here you buy access. Sometimes you are buying access to what used to be a packaged app and this is usually fairly simple. Other times you are buying or renting access to services or capacity.
That is the case with Windows Azure, which can be offered as infrastructure or platform as a service. Here you can pay for what you agree to use, rather than under the existing and still-available pay-as-you-go model.
If you agree to a certain amount of use, you can get one of three discount levels: 20, 23 or 27 percent. The lowest discount is for monthly commitments of $500 to $14,999 (they couldn't round up?) while the biggest price cut is for $40,000 and above.
Go beyond what you committed to and pay-as-you-go pricing kicks back in.
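The tier math above can be sketched as a simplified model. Note the middle tier's range ($15,000 to $39,999 at 23 percent) is inferred from the two stated tiers, and real Azure billing applies discounts to metered usage rates rather than this flat simplification.

```python
# Simplified model of the commitment-tier pricing described above. The
# middle tier's boundaries are inferred; real Azure billing discounts
# metered rates, so treat this as an illustration of the tiers only.

def monthly_cost(commitment, usage):
    """Committed spend gets a tiered discount; overage reverts to full price."""
    if commitment >= 40_000:
        discount = 0.27
    elif commitment >= 15_000:
        discount = 0.23
    elif commitment >= 500:
        discount = 0.20
    else:
        discount = 0.0
    overage = max(0, usage - commitment)  # beyond the commitment: pay-as-you-go
    return commitment * (1 - discount) + overage

print(monthly_cost(10_000, 12_000))  # 8000.0 discounted + 2000 at full price
```

In other words, committing to $10,000 and using $12,000 means the first $10,000 earns the 20 percent cut while the $2,000 overage is billed at the ordinary rate.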
If you are confused, avail yourself of the Windows Azure Pricing Calculator available here.
Posted by Doug Barney on 10/11/2012 at 12:47 PM
Folks building private clouds often go with what they know, and when it comes to virtual infrastructure what they know is usually VMware. And when it comes to VMware private clouds, vSphere is the way to go.
But vSphere and VMware pricing until recently didn't live up to the sterling reputation of the company's products. Licensing was based on a rather bizarre approach that charged based on how much memory the tool used. Weird and complicated? Yes. Costly? For sure.
At the recent VMworld in Las Vegas, it wasn't a new product but a nice zinger that drew applause during a company keynote: the end of memory-based pricing and a return to CPU-based pricing.
For an analysis as to why RAM-based pricing was so bad, check out this blog.
Posted by Doug Barney on 10/02/2012 at 12:47 PM
Third parties large and small are moving on-premises wares to the cloud. One advantage is this existing software is presumably stable and debugged. It also offers the ability to have a hybrid install where some software and data stays in the shop while the rest is out in the ether.
This is what Symantec has done with Enterprise Vault.cloud.
The tool was originally just Symantec Enterprise Vault, and did (and still does) e-mail archiving. The cloud version takes all the existing features and adds access from anywhere. Your mail can be read from a browser, smart phone or your trusty PC.
For IT, cloud storage means you can stop enforcing mailbox limits, an unwelcome task that turns IT into the bad guy.
We spoke with Amy Dugdale, Symantec's senior manager of product marketing for Enterprise Vault.cloud, for more about this new approach, including the benefits of the hybrid model and the ability to choose between in-house and the cloud.
"If they are looking for a predictable cost structure with per user/per month pricing, unlimited storage and retention and automatic maintenance/upgrades, then Enterprise Vault.cloud is a great way to go," says Dugdale. "If they already have full-time IT staff that can manage the archive on-premise then they may prefer to use Enterprise Vault."
Posted by Doug Barney on 10/02/2012 at 12:47 PM