CSC Acquires Cloud Service Automation Vendor ServiceMesh

Systems integrator CSC has filled a key hole in its cloud services portfolio by agreeing to acquire ServiceMesh, a cloud service automation provider whose software facilitates running apps across multiple public cloud services.

The deal gives CSC the ServiceMesh Agility Platform, software that automates the deployment and management of enterprise apps running on public, private and hybrid clouds. The ServiceMesh policy engine provides governance, compliance and security for enterprise information running in various cloud environments, the company said.

"With ServiceMesh, we will empower an ecosystem of enterprise software providers by lowering the friction for companies to execute a multi-vendor hybrid cloud strategy while maintaining central governance, policy and administration," said CSC Chief Technology Officer Dan Hushon in a statement announcing the deal.

Forrester Research analyst Dave Bartoletti noted in a blog post that adding ServiceMesh gives CSC a much-needed unified service catalog, orchestration and governance platform; cloud neutrality; and an orchestration engine that rivals like Dell, HP and IBM already have.

"CSC has picked up one of the few independent hybrid/multi-cloud management vendors," Bartoletti noted. "CSC can combine its strong managed services capabilities and IT management tools expertise with the application lifecycle (DevOps) focus of ServiceMesh to reach a powerful cloud buyer: the app owner and developer. Apps are where the cloud action is."

CSC said the deal is slated to close by the end of the quarter.

Posted by Jeffrey Schwartz on 10/31/2013 at 1:24 PM


Salesforce.com Launches Private AppExchange

In a move aimed at broadening the enterprise reach of its application marketplace, Salesforce.com on Thursday launched its Salesforce Private AppExchange. The new exchange lets IT organizations roll out their own apps, which can be accessed on mobile devices and Web-based clients.

Built on the foundation of the existing Salesforce AppExchange, the new private iteration lets enterprise customers build and distribute internally developed software that can be accessed via Salesforce Identity, the single sign-on service the company announced earlier this month. The private enterprise marketplace is designed to let IT organizations consolidate access to applications and give employees a broader choice of tools on a common platform.

"You can distribute any application, so it could be an app that's delivered by one of our ISVs that's listed on the AppExchange, or it could be an application that runs on another cloud service that doesn't have an integration point to the platform," explained Sara Varni, senior director of Salesforce.com's AppExchange.

Asked if the Private AppExchange requires pre-integration with other cloud services or application stores, Varni said, "It's really a mechanism to distribute applications to end users more so than where the application exists."

One key benefit of the Salesforce Private AppExchange is that it will allow organizations to integrate Salesforce.com's Chatter enterprise social networking service into their apps, explained Constellation Research vice president and principal analyst Alan Lepofsky.

"Employees can now engage with their colleagues to discuss the applications, provide self-support, weed out the best apps, etc.," Lepofsky said. "This is more effective than accessing a generic public app store and having to fend for yourself to discover the best tools. Several social business vendors provide integrated application stores, but many are just catalogs and are lacking the social elements that Salesforce is looking to provide."

Users can move from the Private AppExchange to the Salesforce Identity app launcher to see which apps they have access to, Varni added. Any Salesforce customer with access to the AppExchange via an enterprise license has rights to the private exchange as well, she said. Organizations without enterprise licenses can add access to the new private exchange and the new Salesforce Identity service for $5 per user, per month.

Posted by Jeffrey Schwartz on 10/31/2013 at 1:21 PM


EarthLink Launches vCloud-Based Private Cloud Service

Remember when EarthLink was the Internet service provider many people used if they didn't want to rely on AOL for their dial-up access?

EarthLink, which has a formidable communications backbone these days, is also looking to make its mark as a cloud infrastructure service provider. 

The company this week launched a private cloud hosting service targeted at those who require dedicated clouds rather than shared multitenant services. The company says its new Private Cloud Hosting is intended for workloads that require high availability. It's available in four service levels and delivers N+1 redundancy through a self-healing platform that offers automated server failover, SSAE 16 SOC 2-certified datacenters, a domestic MPLS network and a SAN.

EarthLink's cloud infrastructure is based on VMware's vCenter cloud management platform, which supports customizable dashboards and real-time performance monitoring and analytics. Users can specify most Windows Server releases, from Windows 2000 Server through Windows Server 2012, as well as popular Linux server distributions.

The service is targeted at implementations that require 10 virtual machines or more. The company doesn't publish its pricing; rather, it issues quotes based on configuration requirements. While building up its communications backbone over the years -- most recently with its OC-192 10 Gbps fiber backbone, voice over IP portfolio, IP-based MPLS and EVDO wireless service -- EarthLink has also assembled cloud hosting assets.

Over two years ago, the company acquired LogicalSolutions.net, an enterprise cloud service provider. Also that year, EarthLink acquired Business Vitals, a managed IT security and professional services company; xDefenders, which also provides managed IT security services; and Synergy Global Solutions' cloud-based applications business. EarthLink this year also picked up CenterBeam, which provides remote IT services, and has added four datacenters to its cloud hosting platform.

The only question that remains: will IT decision-makers be able to shake off their image of EarthLink as that provider of dial-up (and, later, ISDN and DSL) services when considering it for cloud hosting?

Posted by Jeffrey Schwartz on 10/24/2013 at 10:13 AM


Cloud Technology Partners' PaaSLane Aims To Simplify App Migration

Cloud Technology Partners is best known for its consulting services. But the company has a new business aimed at making it easier for developers to migrate their .NET and Java applications to different cloud service providers.

The company's new PaaSLane tool aims to eliminate the time-consuming, labor-intensive process of manually evaluating source code to ensure an application will work properly in a cloud environment. That's especially important as providers roll out new services and update existing ones.

Cloud Technology Partners this week released a free, downloadable beta version of PaaSLane.

PaaSLane is designed to continuously track changes in cloud environments and provide recommendations to developers to remediate their code, said Ben Grubin, director of product management at Cloud Technology Partners. "We provide overall cloud readiness metrics and reports for each app run through the software," Grubin said, adding that PaaSLane's engine can process 250,000 lines of code in 10 minutes. "It doesn't just find issues -- it estimates the development effort to address those issues."

This is important, Grubin said, to keep up with different cloud platforms, including offerings from Amazon Web Services, Microsoft's Windows Azure and the various OpenStack-based services.

"This is a complex problem because there are thousands of changes from different cloud providers, and developers don't spend time keeping up with all of these platforms," he said. "You can spend a lot of time researching how to develop for each platform, and that can have a huge productivity impact."

Posted by Jeffrey Schwartz on 10/24/2013 at 10:51 AM


OpenStack Havana Adds Cloud Monitoring, Orchestration and Global Cluster Support

The OpenStack Foundation on Thursday released the eighth version of its open source cloud infrastructure operating platform.

The new OpenStack Havana release has 400 new features, the most noteworthy being metering and monitoring, orchestration, and global cluster support in the platform's object storage system.

Havana also offers improved quality-of-service settings, end-to-end encryption across all block storage drivers, SSL built into all of the OpenStack service APIs, an upgraded VPN, firewall support and the ability to boot from a volume, enabling live migrations.

OpenStack Foundation Executive Director Jonathan Bryce told me these new features are critical for enterprise and business-critical applications, including ERP and virtual desktop infrastructure.

"Enterprises can deploy more and more applications into their OpenStack environments," Bryce said in a briefing Wednesday. "That's critical if you're going to run your business-critical systems in the cloud."

Bryce described the metering and monitoring capability, which can be exposed through a Web-based dashboard, as critical for those who require visibility into usage across all of the OpenStack services.

"Before, if you wanted to pull out who was using more resources or determine where there were bottlenecks and hotspots, you needed to use different tools," Bryce explained. "The OpenStack metering and monitoring services pre-integrate with the other OpenStack services and aggregate all that data into a central location."

That's useful from an operational standpoint but also for those who need to bill users and/or do chargeback, as well as for issuing alerts when certain conditions arise, he added. The monitoring and metering capability can be added to dashboards and other tools via a new central API.

"It's a powerful way to see what's going on in your cloud app at an overall level or to see which users or apps are driving any usage," Bryce said. Administrators can track network and storage users and drill down on a per-app or per-user basis.

The new orchestration capability is based on a template-driven engine that lets developers describe the infrastructure resources needed to deploy an application. In the template, a developer can define compute nodes such as Web servers and database servers, along with networks and storage resources; the orchestration engine takes that description, provisions the resources and issues the necessary calls to set everything up, Bryce said.

"What's cool is because it ties into the metering and monitoring service, you can set thresholds, and if they receive alerts, they can take actions like auto-scaling," he said. "These two features in combination provide a high level of functionality on top of the core OpenStack components."

The template language is JSON, and Bryce said quite a few templates are already available in the open source community. He anticipates more will surface now that the Havana release has gone live.
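As a rough illustration of that template-driven approach, the Python sketch below emits a minimal CloudFormation-style JSON template of the kind the Havana orchestration engine accepts; the image name, flavor and file name are hypothetical placeholders rather than values from the release.

import json

# A minimal, hypothetical template describing a single Web server.
# Real templates would add database servers, networks, storage volumes
# and auto-scaling groups in the same declarative style.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Description": "Single Web server provisioned by the orchestration engine",
    "Parameters": {
        "KeyName": {"Type": "String", "Description": "SSH key pair to inject"}
    },
    "Resources": {
        "WebServer": {
            "Type": "AWS::EC2::Instance",
            "Properties": {
                "ImageId": "fedora-20-x86_64",   # placeholder image
                "InstanceType": "m1.small",      # placeholder flavor
                "KeyName": {"Ref": "KeyName"},
            },
        }
    },
}

# Write the template out; it can then be handed to the orchestration API
# or command-line client (for example, via `heat stack-create`).
with open("webserver.json", "w") as f:
    json.dump(template, f, indent=2)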

The global cluster support in the object storage system enables a single storage system to span multiple datacenters, which is important both for disaster recovery, by enabling replication across datacenters, and for allowing users to connect to the nearest cloud facility, said Joe Arnold, CEO of SwiftStack, the private cloud storage provider that led development of the global cluster support in OpenStack Havana's object storage system.

Concur, which provides a widely used, cloud-based expense reporting Software as a Service (SaaS) offering, is among the SwiftStack customers that have deployed the new global clustering capability in their OpenStack private clouds; the company will talk about the deployment at next month's OpenStack Summit in Hong Kong.

"They have two datacenters set up and they're sending their users to the nearest datacenter," Arnold said. "When a customer takes a picture of a receipt with their phone, it's uploaded to the datacenter nearest to them. That just gives the user a better experience. And if there's a failure in either datacenter, they can send all of the users to the other datacenter and not miss anything."

Red Hat is among those in the OpenStack community that have released new distributions based on Havana. Red Hat's RDO Havana is aimed at deployment across Red Hat-derived Linux distributions, including Red Hat Enterprise Linux, Fedora, CentOS and Scientific Linux. Canonical's new Ubuntu Server 12.04 LTS and 13.10 also include the new OpenStack Havana release.

Posted by Jeffrey Schwartz on 10/17/2013 at 12:12 PM


VMware Updates vCenter Ops Manager, vCenter Log Insight

At this week's VMworld 2013 Europe conference in Barcelona, VMware announced key updates to several of its products. Most important is improved performance monitoring of Hyper-V, SQL Server and Exchange within VMware vCenter Operations Management Suite 5.8. VMware also updated its vCenter Log Insight analytics tool.

The vCenter Operations Management Suite could already monitor and manage Hyper-V workloads at the guest OS level by placing its Hyperic agent inside each virtual machine. Now, through Microsoft's System Center Operations Manager or VMware's Hyperic management packs for vCenter Operations Manager, it can manage Hyper-V hosts and virtual machines directly. It also adds support for Amazon Web Services (AWS) EC2 Infrastructure as a Service (IaaS).

The new version collects data on the CPU, network, memory and other components, and feeds that into its analytics engine to separate normal performance behavior from unhealthy activity and then provides alerts.

Expanding support for Hyper-V is a smart move that will be welcomed by VMware and Microsoft customers alike, said Pund-IT principal analyst Charles King. "By expanding its support for and visibility into Hyper-V and public cloud services like AWS, VMware is highlighting its continuing technical leadership," King said. "Since these new features are also coming with no additional premium, adding them also enhances the value proposition of VMware's solutions and services."

In tandem with the vCenter Operations Management Suite upgrade, VMware also updated its recently launched vCenter Log Insight analytics tool. The new Log Insight 1.5 release provides real-time analytics, searchable log data and dashboards. First released in June, Log Insight now supports "content packs" for specific systems such as Exchange Server and SQL Server, as well as products from HyTrust, EMC, NetFlow and VMware's own Nicira software-defined networking platform, said Mark Leake, director of product marketing in VMware's cloud management business unit.

"These new out-of-the-box capabilities enhance the discovery and topology," Leake said. "So you get deeper discovery of app instances and components and you can apply the analytics that we have in vCenter Operations Manager to them."

Posted by Jeffrey Schwartz on 10/16/2013 at 4:49 PM


Amazon Web Services Offers Cloud Incentives to Startups

Amazon Web Services (AWS) on Thursday launched a program for startup customers that includes training, developer resources and a community aimed at bringing startups and third parties together. Called AWS Activate, the program also gives startups credits toward use of the AWS cloud.

The AWS Activate offering consists of two packages. The Self Starter package is available for any startup. It includes an AWS Free Usage Tier, one month of support for developers, technical training, credit for one AWS training lab and access to Amazon's Startup Forum, AWS evangelist Jeff Barr said in a blog post.

Those qualifying for the Portfolio package are startups that have received venture capital, seed, accelerator or incubator funding. Portfolio members receive one month to one year of AWS business-level support, AWS Technical Professional and AWS Essentials training, and credit for four self-paced AWS training labs.

"Startups operate in a world of high uncertainty and limited capital, so an elastic and on-demand infrastructure at low and variable cost aligns very naturally with their needs," said Amazon CTO Werner Vogels in a blog post. "By reducing the cost of failure and democratizing access to infrastructure, the cloud has enabled more startups to build, experiment, and scale. AWS Activate is designed to provide startups with the resources they need to build applications on AWS."

Vogels noted that the first partners to provide specialized offers to program participants include Opscode, provider of the Chef configuration, cloud management and automation tools; AlertLogic, a cloud security provider; and SOASTA, which offers testing tools.

Posted by Jeffrey Schwartz on 10/10/2013 at 3:48 PM


Verizon Launches New Cloud Compute and Storage Service

Verizon plans to offer a new cloud Infrastructure as a Service (IaaS) that will allow enterprise customers to provision compute and storage capacity.

Already a major provider of cloud IaaS with the Terremark service it acquired over two years ago for $1.4 billion, Verizon is poised to offer a more configurable set of services with its new offering. The company laid out plans to roll out its new Verizon Cloud Compute and Verizon Cloud Storage services on Thursday at the Interop trade show in New York.

The new services will allow enterprise customers to configure any combination of compute, memory, storage and network, said Verizon Terremark Chief Technology Officer John Considine during a keynote address at Interop.

"Instead of forcing users to compromise and choose the preselected settings of CPU, memory and storage, the Verizon cloud allows you to independently set these parameters and build the machine you actually want," Considine said. "You set these parameters and only pay for what you use. This means for the first time you can run your applications in a multitenant cloud and not feel the effects of multitenancy."

In private beta since last December, the services are now available for public evaluation. The company plans to make the services generally available later this quarter. The services will operate out of seven datacenters in the United States, Latin America and Europe, with plans to expand into Asia next year. Customers can configure 21 different compute environments including Windows Server, CentOS, Ubuntu and Red Hat Linux.

Kevin Clarke, Verizon Terremark's director of cloud engineering, acknowledged the company is phasing out the Terremark brand for its enterprise cloud business. "Our brand going forward is Verizon," Clarke said. "We are leveraging all the assets Verizon has to establish ourselves as a credible alternative to other cloud providers in the market."

Clarke explained how Verizon can make such a claim. "What we've actually built is a distributed switch and we've unified our networking and storage protocols as a flat layer 2 network that we're running through traffic shapers to offer performance and quality of service," he said.

While Considine said in his keynote that the storage service will support Amazon Web Services S3 storage API, he didn't describe the Verizon Cloud Compute and Verizon Cloud Storage services as open or portable in any other ways, such as support for the growing OpenStack platform.

Clarke said Verizon over time will be able to address that. Up the stack, the service's new orchestration layer is completely based on home-grown Java code, Clarke explained.

"The object model we are embodying in terms of the implementation of the code itself is divorced from the API set," Clarke said. "So we have the ability in the future, and it's on our roadmap, to offer other API flavors, including OpenStack, Amazon, CloudStack, what have you." In addition to S3, the Verizon Cloud Storage service supports the OpenStack SWIFT specification, Clarke noted.

Verizon has its own API set, which expresses what's operationally possible in the system, and the company is offering developers an SDK to implement that API set, Clarke explained. It will support other APIs next year.

From an infrastructure perspective, the new service is based on the Xen environment, with OmniVision support for multiple virtual environments. Initially, customers can run Xen images or VMware images through its support for VMDKs, Clarke said. The company plans to support Microsoft's Hyper-V in the future, he noted.

For now, Verizon is emphasizing the configurability of the new service. "Control and configuration is important in the cloud," Clarke said. "That model of one size fits all is frustrating to the user community and it doesn't necessarily give them what they need. We want to offer flexibility and the ability to control that flexibility dynamically."

Posted by Jeffrey Schwartz on 10/04/2013 at 11:42 AM


Microsoft's Cloud Gets Security Approval from Feds

The U.S. government has granted Microsoft's Windows Azure cloud service a Provisional Authority to Operate (P-ATO) from the FedRAMP Joint Authorization Board (JAB).

That means Windows Azure meets the security requirements of federal agencies looking to use public Infrastructure as a Service (IaaS) and Platform as a Service (PaaS), said Susie Adams, chief technology officer for Microsoft Federal, in a blog post Monday.

While many cloud providers already meet FedRAMP requirements, Microsoft claims Windows Azure is the first to receive a P-ATO from the JAB, which includes representatives from the Department of Defense, the Department of Homeland Security and the U.S. General Services Administration (GSA).

"Securing a P-ATO from the JAB ensures that when government agencies have a need for an Infrastructure as a Service (IaaS) or Platform as a Service (PaaS), they know that Windows Azure has successfully met the necessary security assessments," Adams said in her post. "This not only opens the door for faster cloud adoption, but helps agencies move to the cloud in a more streamlined, cost-effective way. Additionally, since Microsoft datacenters were also evaluated as part of the JAB review process, other Microsoft cloud services are ultimately better aligned to meet these security controls as well."

Certainly customers -- either government agencies or others that require the highest level of security -- will welcome this latest milestone. But it remains to be seen whether Microsoft's latest cloud security milestone will be enough to overcome concerns over the government's surveillance efforts under such programs as PRISM, as noted in this month's Redmond magazine cover story.

What's your take on Windows Azure achieving FedRAMP compliance? Leave a comment below or reach me at [email protected].

Posted by Jeffrey Schwartz on 10/02/2013 at 12:07 PM


Cloud Security Alliance Launches Certification Program, Updates Controls Matrix

Two years after launching a registry that lets cloud providers reveal the security controls they have in place, the Cloud Security Alliance (CSA) is now enabling them to get third-party certification of those claims.

The CSA, joined by BSI Group, on Thursday said it will offer the STAR Certification program. 

At the CSA Congress EMEA in Edinburgh, Scotland, the group also released Version 3.0 of its Cloud Controls Matrix (CCM), which it described as its most rigorous standard yet for assessing security risks and controls. The STAR Certification, offered with BSI Group, promises to be welcomed by those who want third-party validation of the controls providers have in place.

Having that third-party validation should be especially welcome in the wake of revelations, leaked by Edward Snowden, that the National Security Agency (NSA) runs surveillance programs such as the widely publicized PRISM. While the NSA's covert surveillance activities are aimed at thwarting terrorist threats, revelations of their scope and charges that service providers were cooperating with the government have run counter to the compliance requirements of many businesses. The disclosures have also led decision makers and consumers alike to wonder how far they should trust their providers; Redmond magazine reported that 70 percent of respondents to an online survey were concerned about the surveillance activities.

BSI Group will provide technology-neutral certifications to ensure cloud providers are meeting the ISO/IEC 27001:2005 management system standard along with the CSA's Cloud Controls Matrix, the CSA said. BSI Group will assign "management capability" scores to the matrix's 11 control areas, which cover compliance, data governance, facility security, human resources, information security, legal, operations management, risk management, release management, resiliency and security architecture.

"In light of recent government revelations, both consumers and providers of cloud-based services have been asking for independent, technology-neutral certification to help them make more informed decisions about the services they purchase and use," said Daniele Catteddu, EMEA managing director at CSA, in a statement. "In providing a rigorous, user-centric assessment, STAR Certification will provide an additional layer of transparency that the industry has been calling for."

As for the new Cloud Controls Matrix 3.0, the CSA outlined three key updates:

  • "Five new control domains that address information security risks over the access of, transfer to, and securing of cloud data: Mobile Security; Supply Chain Management, Transparency & Accountability; Interoperability & Portability; and Encryption & Key Management
  • "Improved harmonization with the Security Guidance for Critical Areas of Cloud Computing v3
  • "Improved control auditability throughout the control domains and an expanded control identification naming convention"

 "The decision to use a cloud service distills down to one question: 'Do I trust the provider enough for them to manage and protect my data?'" stated Sean Cordero, co-chair of the Cloud Controls Matrix Working Group and founder of security consultancy Cloud Watchman, based in San Francisco.

The CSA said it will hold workshops covering the new controls and the certification at the CSA Congress 2013, scheduled for Dec. 3 to 5 in Orlando, Fla.

Posted by Jeffrey Schwartz on 09/26/2013 at 2:03 PM


Red Hat Targets OpenShift PaaS Tools for Enterprise App Development

Red Hat this week disclosed plans to deliver on its OpenShift Platform as a Service (PaaS) portfolio, aimed at letting developers build and deploy modern, composite and mobile apps to public, private and hybrid cloud environments.

The company outlined its new JBoss xPaaS services for OpenShift suite, based on its development environment for enterprise Java and middleware. In addition to providing new tooling for developers to build modern and mobile apps for PaaS clouds, Red Hat officials said the new suite will help developers connect existing legacy apps to cloud environments without recoding them. 

During a webcast Tuesday, Red Hat's president of products and technologies, Paul Cormier, described the release as a key step forward in its long-stated plan to deliver on its OpenShift strategy. OpenShift is a key open source cloud PaaS effort championed by Red Hat. Cormier described JBoss xPaaS services for OpenShift as "the first comprehensive set of the services needed to build modern, complex, enterprise applications on enterprise-grade PaaS platforms."

The tooling is aimed at helping JBoss developers build apps for PaaS clouds without having to learn new development techniques, Cormier said, describing xPaaS as "a developer interface to the operating system of the cloud." He added, "There is a real, real gap between low-level services provided by existing PaaSes and what is needed for composite enterprise apps of today."

IDC analyst Al Hilwa said the research firm is aware of some 40 PaaS offerings, none of them complete, and that xPaaS is a key step toward addressing enterprise requirements. "Part of the issue with the PaaS market is that big players like Red Hat, IBM and Oracle are only now beginning to address the market for Java PaaS in the cloud effectively," Hilwa said.

Middleware fills a key gap in the offerings available, according to Cormier, because it allows application developers to "build at a higher level of abstraction and not have to constantly re-invent the wheel." It also lets datacenter administrators manage applications, as well as debug, scale, update and accelerate them more efficiently, he added. Red Hat is delivering components of the suite in what it calls cartridges.

The first cartridge is designed to push notification services to mobile devices via the AeroGear Unified Push Server, which Red Hat calls mPaaS. That will be followed by iPaaS, a middleware integration service, enabled by its acquisition of Fuse, which provides the Apache Camel integration suite. Also in the pipeline is a business process and rule management cartridge called bpmPaaS.

Red Hat plans to incorporate these xPaaS components into its JBoss middleware and offer them through both its public PaaS offering, OpenShift Online, and its OpenShift Enterprise private cloud suite, said Craig Muzilla, Red Hat's vice president and general manager for middleware, in a blog post.

"All of these xPaaS services, including aPaaS with JBoss Enterprise Application Platform, iPaaS with JBoss Fuse, bpmPaaS with JBoss BPM technologies and JBoss BRMS and mobile services, will be provided under a single PaaS environment," Muzilla noted. "No longer will enterprises be forced to go to many different PaaS environments in order to obtain what is necessary to build a true, n-tiered enterprise application."

But OpenShift is also competing for developer mindshare with the VMware-led Cloud Foundry effort, which the company recently spun off as Pivotal, led by Paul Maritz, as well as with Salesforce.com's Heroku. Microsoft is also an early leader in PaaS with its Windows Azure service, which launched as a PaaS in 2010.

It's too early to predict which has an edge, said IDC's Hilwa. "In the end it will be an ecosystem battle," he said.

Posted by Jeffrey Schwartz on 09/26/2013 at 1:26 PM


Cloud Storage Provider Nirvanix Goes Belly-Up, Customers Panic To Move Data

Cloud service provider Nirvanix stunned its customers early this week, telling them they have two weeks to find another home for their terabytes -- and in some cases petabytes -- of data because the company is shutting down.

First reported on Monday by the U.K.-based Web site Information Age, the sudden collapse of one of the largest cloud storage providers has left customers scrambling, especially those with huge amounts of data stored. Among them are National Geographic and Fox, which use Nirvanix to store their data, according to published reports. Many major providers, including Dell, HP, Riverbed, Symantec and TwinStrata, sourced their cloud storage from Nirvanix when offering their own services, according to StorageReview.com.

Since Monday's reports that Nirvanix has run out of money and informed customers and partners that they have two weeks to migrate their data to another cloud provider, the company has given them a slight reprieve: Now they have until Oct. 15, a customer support rep at the company told me Thursday.

Customers will no longer be permitted to upload any data as of next Monday, Sept. 23, but the customer service rep agreed with me that one might not want to upload another bit at this point.

More than 1,000 enterprise customers have data stored in the Nirvanix cloud service, according to Forrester Research analyst Henry Baltazar, who said in a blog post Wednesday that the provider's demise validates warnings that it's easier to upload to the cloud than to recover large quantities of data.

"The recent example with Nirvanix highlights why customers should also consider exit and migration strategies as they formulate their cloud storage deployments," Baltazar said. "One of the most significant challenges in cloud storage is related to how difficult it is to move large amounts of data from a cloud. While bandwidth has increased significantly over the years, even over large network links it could take days or even weeks to retrieve terabytes or petabytes of data from a cloud. For example, on a 1 Gbps link, it would take close to 13 days to retrieve 150 TB of data from a cloud storage service over a WAN link."

Gartner analyst Kyle Hilgendorf also emphasized in a blog post that failure to have an exit strategy when using a cloud service, especially for storage, can be a recipe for disaster. As for this week's Nirvanix news, Hilgendorf said: "What are clients to do? For most -- react...and react in panic. You may be facing the worst company fear -- losing actual data."

With everyone trying to migrate data simultaneously over Nirvanix's limited network resources, many customers are finding themselves in a jam, said Andres Rodriguez, founder and CEO of Nasuni, which provides its own storage service and once used Nirvanix as its back-end target.

"What's happening now with Nirvanix is the equivalent of bank rush," Rodriguez said. "Everyone is trying to get their data out in a hurry and you know what that does to a network, and it's going to be very hard to get their data out."

Rodriguez said he has warned his customers for many months that he believed Nirvanix was at risk of going out of business. Rodriguez said when Nasuni used Nirvanix as its cloud storage provider two years ago, he became increasingly concerned it couldn't scale. Nasuni now uses Amazon Web Services Simple Storage Service (S3) for primary storage. 

Nasuni runs annual tests against what Rodriguez believes are the largest cloud providers. The most recent test results released earlier this year concluded Amazon S3 and Windows Azure were the only two viable enterprise-grade storage services.

Nasuni just added a mirroring option that lets customers replicate their data stored in Amazon S3 to Windows Azure for added contingency. While Rodriguez believes Amazon S3 and Windows Azure are the most scalable and resilient, he warns it could be years before the majority of customers feel comfortable using the latter as their primary target.

Posted by Jeffrey Schwartz on 09/19/2013 at 3:49 PM

