Cisco, Microsoft Strengthen Cloud Pact

Eight months ago, Satya Nadella -- before he was named Microsoft CEO -- announced a cloud-based partnership with Cisco: "Microsoft and Cisco’s Application Centric Infrastructure, hello cloud!"

"Through this expanded partnership, we will bring together Microsoft’s Cloud OS and Cisco’s Application Centric Infrastructure (ACI) to deliver new integrated solutions that help customers take the next step on their cloud computing journey," Nadella wrote at the time.

Well, the companies took another step on their own journey yesterday with the networking giant's announcement that "Cisco Doubles Down on Datacenter with Microsoft."

But the language stayed pretty much the same.

"By providing our customers and partners with greater sales alignment and even deeper technology integration, we will help them transform their datacenters and accelerate the journey to the cloud," said Cisco exec Frank Palumbo.

The three-year agreement, announced at the Microsoft Worldwide Partner Conference in Washington, D.C., will see the companies investing in combined sales, marketing and engineering resources.

Cisco announced integrated products and services that will focus on the Microsoft private cloud, server migration, service providers and SQL Server 2014.

"Cisco and Microsoft sales teams will work together on cloud and datacenter opportunities, including an initial program focused on the migration of Windows 2003 customers to Windows 2012 R2 on the Cisco UCS platform," Cisco said.

A bevy of other offerings to be integrated includes Cisco Nexus switching, Cisco UCS Manager with System Center integration modules, Cisco PowerTool, Windows Server 2012 R2, System Center 2012 R2, Windows PowerShell and Microsoft Azure. Future releases will integrate Cisco ACI and Cisco InterCloud Fabric.

Further details were offered by Cisco's Jim McHugh. "At Cisco we believe our foundational technologies -- with UCS as the compute platform, Nexus as the switching platform, and with UCS Manager and System Center management integration -- provide customers an optimal infrastructure for their Microsoft Windows Server workloads of SQL, SharePoint, Exchange and Cloud."

Microsoft's Stephen Boyle also weighed in on the pact. "Enterprise customers worldwide are betting on Microsoft and Cisco to realize the benefits of our combined cloud and datacenter technologies," Boyle said. "Now, together, we're strengthening our joint efforts to help customers move faster, reduce costs and deliver powerful new applications and services for their businesses."

And now, eight months after that first ACI announcement, you can expect the companies' joint journey to the cloud -- customers in tow -- to progress even further.

"This greater alignment between Cisco and Microsoft will set the stage for deeper technology integration," said Cisco's Denny Trevett. "Together we will deliver an expanded portfolio of integrated solutions to help customers modernize their datacenters and accelerate their transition to the cloud."

So get ready for your trip, because you're going.

Posted by David Ramel on 07/16/2014 at 12:47 PM


Self-Service Cloud Tackles IT/Developer Divide

RightScale Inc. announced a new self-service portal that lets enterprises set up cloud services brokerages to provide developers and other users with instant access to cloud infrastructure.

Using RightScale Self-Service, IT departments can define a curated catalog of different resources that can be used for scenarios such as application development, testing, staging and production; demonstrations by sales and others; training; batch processing; and digital marketing, the company said.

RightScale Self-Service allows users to automatically provision instances, stacks or complex multi-tier applications from a catalog defined by IT.

Such catalogs can feature "commonly used components, stacks, and applications and an easy-to-use interface for developers and other cloud users to request, provision, and manage these services across public cloud, private cloud, and virtualized environments," said RightScale's Andre Theus in a blog post.

The company said the self-service portal can help companies bridge the divide between developers and central IT departments and control "shadow IT" issues. These problems occur because the ability to provide infrastructure and platform services has lagged behind development practices such as Agile, continuous delivery and continuous integration.

The RightScale ecosystem (source: RightScale Inc.)

Developers and other users might have to wait for weeks or months for such resources to be provided, RightScale said. This creates a divide between IT and these staffers. When users get impatient and turn to public cloud services to instantly gain access to the infrastructure and platform resources they need, a "shadow IT" problem can develop, with IT having no control over or visibility into the outside services being used.

"By offering streamlined access to infrastructure with a self-service portal, IT can now meet the business needs of their organizations," the company said in a white paper (registration required to download). "At the same time, a centralized portal with a curated catalog enables IT to create a sanctioned, governed, and managed process for cloud use across both internal and external resource pools.

"Cloud architects and security and IT teams can implement technology and security standards; control access to clouds and resources; manage and control costs; and ensure complete visibility and audit trails across all their cloud deployments."

As quoted by the company, Redmonk analyst Stephen O'Grady agreed. "Developers love the frictionless availability of cloud, but enterprises crave visibility into their infrastructure, which is challenged by the widespread cloud adoption," O'Grady said. "RightScale Self-Service is intended to serve as a way to provide both parties what they need from the cloud."

The RightScale Self-Service catalog (source: RightScale Inc.)

What the parties need, RightScale said, are capabilities in four areas: standardization and abstraction; automation and orchestration; governance and policies; and cost management.

Under standardization and abstraction, users can design the catalog applications to meet company standards regarding security, versioning, software configuration and so on.

Under automation and orchestration, users can define workflows to provision multi-tier apps, take snapshots and roll back to them if needed, and integrate with IT service management systems or continuous integration services. The orchestration capabilities are enabled through a cloud-focused workflow language that lets users automate ordered steps and call RESTful APIs to integrate with other services and systems; RightScale also exposes a public API so the portal itself can be integrated with other systems.
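
To make that concrete, here's a rough idea of what "ordered steps plus RESTful APIs" can look like from a script. This is a generic, hypothetical sketch: the endpoint paths, field names and token handling below are illustrative assumptions, not RightScale's actual API.

    import requests

    # Hypothetical self-service portal endpoint and token -- illustrative only,
    # not RightScale's actual API.
    PORTAL = "https://self-service.example.com/api"
    HEADERS = {"Authorization": "Bearer <api-token>", "Content-Type": "application/json"}

    # Fetch the curated catalog that IT has defined.
    catalog = requests.get(f"{PORTAL}/catalog_items", headers=HEADERS).json()

    # Request a launch of an IT-approved stack in a specific cloud; quota and
    # policy checks would be enforced server-side by the portal.
    item = next(i for i in catalog if i["name"] == "three-tier-web-stack")
    launch = requests.post(
        f"{PORTAL}/launches",
        headers=HEADERS,
        json={"catalog_item_id": item["id"], "cloud": "aws-us-east", "ttl_hours": 72},
    )
    launch.raise_for_status()
    print("Provisioning request accepted:", launch.json().get("id"))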

Under governance and policies, the portal provides policy-based management of infrastructure and resources. IT teams can segregate responsibilities by role. For example, different rules can be set for users who are allowed to define catalog items and for others who only have permission to access and launch them.

Under cost management, the portal can display hourly costs and total costs of services, and IT teams can set usage quotas to keep projects under budget.

RightScale Self-Service also integrates with the company's Cloud Analytics product, so companies can do "what if" analysis on various deployments, clouds and purchase options -- for example, to measure on-demand cost vs. pre-purchased cost. It also integrates with the company's Cloud Management product to administer applications.

In addition to providing on-demand access to cloud resources and offering the technology catalog, the self-service portal supports major private and public clouds, including Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform, OpenStack and VMware vSphere environments.

Regarding the latter, Dan Twing, president and COO of analyst firm Enterprise Management Associates, said, "Treating VMware vSphere like a cloud and providing a governance framework for enterprise cloud usage is a simple, powerful concept that will have deep impact on how enterprises innovate using the full power of cloud computing."

Posted by David Ramel on 07/16/2014 at 8:32 AM


Big Data Storms the Cloud

It's a perfect fit and it isn't a new trend, but Big Data's migration to the cloud seems to be accelerating recently.

The advantages of running your Big Data analytics in the cloud rather than on-premises -- especially for smaller companies with constrained resources -- are numerous and well-known. Oracle Corp. summed up some of the major business drivers in an article titled "Trends in Cloud Computing: Big Data's New Home":

  • Cost reduction
  • Reduced overhead
  • Rapid provisioning/time to market
  • Flexibility/scalability

"Cloud computing provides enterprises cost-effective, flexible access to Big Data's enormous magnitudes of information," Oracle stated. "Big Data on the cloud generates vast amounts of on-demand computing resources that comprehend best practice analytics. Both technologies will continue to evolve and congregate in the future."

In fact, they will evolve to the tune of a $69 billion private cloud storage market by 2018, predicted Technology Business Research. That's why the Big Data migration to the cloud is picking up pace recently -- everybody wants a piece of the multi-billion-dollar pie.

As Infochips predicted early last year: "Cloud will become a large part of Big Data deployment -- established by a new cloud ecosystem."

Infochips illustrates the trend. (source: Infochips)

The following moves by industry heavyweights in just the past few weeks show how that ecosystem is shaping up:

  • IBM last week added a new Big Data service, IBM Navigator on Cloud, to its IBM Cloud marketplace. With a reported 2.5 billion gigabytes of data being generated every day, IBM said the new Big Data service will help organizations more easily secure, access and manage data content from anywhere and on any device.

    "Using this new service will allow knowledge workers to do their jobs more effectively and collaboratively by synchronizing and making the content they need available on any browser, desktop and mobile device they use every day, and to apply it in the context of key business processes," the company said.

    The new service joined other recent Big Data initiatives by IBM, such as IBM Concert, which offers mobile, cloud-based, Big Data analytics.

  • Google Inc. last month announced Google Cloud Dataflow, "a fully managed service for creating data pipelines that ingest, transform and analyze data in both batch and streaming modes."

    The new service is a successor to MapReduce, a programming paradigm created by Google whose open source implementation became a core component of the original Hadoop ecosystem. Limited to batch processing, MapReduce came under increasing criticism as Big Data tools grew more sophisticated (a bare-bones illustration of the batch pattern follows this list).

    "Cloud Dataflow makes it easy for you to get actionable insights from your data while lowering operational costs without the hassles of deploying, maintaining or scaling infrastructure," Google said. "You can use Cloud Dataflow for use cases like ETL, batch data processing and streaming analytics, and it will automatically optimize, deploy and manage the code and resources required."

  • EMC Corp. on Tuesday acquired TwinStrata, a Big Data cloud storage company. The acquisition gives the traditional storage vendor access to TwinStrata's CloudArray cloud-integrated storage technology.

    That was just one of a recent spate of moves to help EMC remain competitive in the new world of cloud-based Big Data. For example, when the company announced an upgrade of its VMAX suite of data storage products for big companies, The Wall Street Journal reported: "Facing Pressure from Cloud, EMC Turns Data Storage into Service."

    The same day, EMC announced "a major upgrade to EMC Isilon OneFS, new Isilon platforms and new solutions that reinforce the industry's first enterprise-grade, scale-out Data Lake." But wait, there's more: EMC also yesterday announced "significant new product releases across its Flash, enterprise storage and Scale-Out NAS portfolios" to help organizations "accelerate their journey to the hybrid cloud."
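
Before moving on, here's a deliberately tiny, framework-free sketch of the batch-only map/reduce pattern referenced in the Google item above -- no Hadoop or Google APIs, just the two phases of the paradigm:

    from collections import defaultdict
    from itertools import chain

    documents = ["big data in the cloud", "the cloud stores big data"]

    # Map phase: emit (key, value) pairs from each input record.
    def map_phase(doc):
        return [(word, 1) for word in doc.split()]

    # Reduce phase: group the pairs by key and aggregate the values.
    def reduce_phase(pairs):
        counts = defaultdict(int)
        for word, n in pairs:
            counts[word] += n
        return dict(counts)

    print(reduce_phase(chain.from_iterable(map_phase(d) for d in documents)))
    # The entire dataset must be available up front -- there is no notion of an
    # unbounded stream, which is the gap services like Cloud Dataflow target.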

EMC's plethora of Big Data/cloud announcements makes it clear where the company is placing its bets. As financial site Seeking Alpha reported: "EMC Corporation: Big Data and Cloud Computing Are the Future."

That was in March, and the future is now.

Posted by David Ramel on 07/10/2014 at 2:37 PM


Red Hat Touts VMware Integration in OpenStack Release

Enterprise Linux giant Red Hat is pushing further into the OpenStack arena with the release of Red Hat Enterprise Linux OpenStack Platform 5.

The company touts integration with VMware infrastructure as a key selling point of its distribution of the OpenStack cloud OS/platform. That's just one of "hundreds" of reported improvements to the OpenStack core components: compute, storage and networking. Other additions include a three-year support lifecycle and new features, such as configuration and installation improvements, designed to help enterprises adopt OpenStack technology.

OpenStack is an open source project originally created by Rackspace Hosting and NASA, now under the direction of a consortium of industry heavyweights, including Red Hat. In fact, Red Hat -- which says it's "all in on OpenStack" -- has been the No. 1 contributor to recent OpenStack releases, the latest being Icehouse, released in April. As with Linux, the vendor makes money by bundling its distribution with services, support and extra features realized as part of the company's development model of "participate, stabilize, deliver."

Red Hat leads individual contributors to the open source OpenStack project. (source: OpenStack.org)

The distribution -- which went into beta in May -- is based on Red Hat Enterprise Linux 7, just released a couple months ago. Its improved support of VMware infrastructure touches on virtualization, management, networking and storage functionalities.

"Customers may use existing VMware vSphere resources as virtualization drivers for OpenStack Compute (Nova) nodes, managed in a seamless manner from the OpenStack Dashboard (Horizon)," Red Hat said in an announcement. The new product also "supports the VMware NSX plugin for OpenStack Networking (Neutron) and the VMware Virtual Machine Disk (VMDK) plugin for OpenStack Block Storage (Cinder)."

Red Hat's OpenStack offering -- aimed at advanced cloud users such as telecommunications companies, Internet service providers and cloud hosting providers -- comes with a three-year support lifecycle, backed by a partner ecosystem with more than 250 members.

New features include better workload placement via server groups. Companies can choose to emphasize the resiliency of distributed apps by spreading them across cloud resources, or ensure lower communications latency and improved performance by placing them closer together.
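
Under the covers this is Nova's server groups feature, driven by affinity and anti-affinity policies. The sketch below shows the general shape of the API calls involved; the endpoint URL, project ID, token and image/flavor IDs are placeholder assumptions, and exact request formats vary by release:

    import requests

    # Placeholder endpoint, project ID and token -- simplified for illustration.
    NOVA = "https://openstack.example.com:8774/v2/<project-id>"
    HEADERS = {"X-Auth-Token": "<token>", "Content-Type": "application/json"}

    # Create an anti-affinity group: members should be scheduled on different
    # hosts (spread for resiliency). An "affinity" policy does the opposite,
    # co-locating members for lower latency.
    group = requests.post(
        f"{NOVA}/os-server-groups",
        headers=HEADERS,
        json={"server_group": {"name": "web-tier", "policies": ["anti-affinity"]}},
    ).json()["server_group"]

    # Boot an instance into the group via a scheduler hint so the placement
    # policy is applied when Nova picks a host.
    requests.post(
        f"{NOVA}/servers",
        headers=HEADERS,
        json={
            "server": {"name": "web-1", "imageRef": "<image-id>", "flavorRef": "<flavor-id>"},
            "os:scheduler_hints": {"group": group["id"]},
        },
    )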

Red Hat also said the interoperability of heterogeneous networking stacks is enhanced through the new Neutron modular plugin architecture. That lets enterprises more easily add new components to their OpenStack deployments and mix and match networking solutions.

A more technical enhancement is the use of the "para-virtualized random number generator device" that came with Red Hat Enterprise Linux 7, so guest applications can use better encryption and meet new cryptographic security requirements.

Red Hat said that in the coming weeks it will make generally available an OpenStack distribution based on its previous Linux release, version 6. The company will maintain both versions so customers can choose the one that best complements the OS they're using.

According to an IDG Connect survey commissioned last year by Red Hat, that overall customer base is expected to grow: 84 percent of respondents indicated they had plans to adopt OpenStack technology.

Posted by David Ramel on 07/09/2014 at 1:49 PM


When Cloud Projects Fail

Enterprises moving to the cloud is a given, most will agree, but new research suggests it's a bumpy ride, with "staggering" project failure rates.

Even among the most well-known Infrastructure as a Service (IaaS) providers, unexpected challenges derail many cloud implementations. These challenges include complicated pricing, hidden costs that erase expected savings, performance problems and more, according to the report, "Casualties of Cloud Wars: Customers Are Paying the Price."

The research was conducted by Enterprise Management Associates Inc. (EMA) and commissioned by cloud infrastructure providers Iland and VMware Inc. EMA recently surveyed more than 400 professionals around the world to gauge their IaaS implementation experiences with major vendors such as Amazon.com Inc., Microsoft and Rackspace.

We need help! Respondents report what type of external support they require when operating a public cloud. (source: iland)

"The promise of cloud remains tantalizingly out of reach, and in its place are technical headaches, pricing challenges and unexpectedly high operational costs," the report stated. Iland, in a statement announcing the study, said, "Respondents reported staggering failure rates across 'tech giant' IaaS implementations and identified the support and functionality needed to overcome top challenges."

As part of the survey, respondents were asked which public cloud services they considered, which were adopted, and which stalled or failed.

Rackspace was the leader in stalled/failure rates, named by 63 percent of respondents, followed by Amazon (57 percent), Microsoft Azure (44 percent) and, finally, VMware vCloud-based service providers, scoring the "best" with a 33 percent reported failure rate. Iland is a VMware-based cloud provider (and remember, Iland and VMware commissioned the study).

"Eighty-eight percent of respondents experienced at least one unexpected challenge," the survey said. "The most common challenge in the United States was support, while performance and downtime topped [Europe] and [Asia-Pacific] lists, respectively.

The unexpected challenges were primarily found in six areas (with the percentage of respondents listing each):
  • Pricing -- 38 percent
  • Performance -- 38 percent
  • Support -- 36 percent
  • Downtime -- 35 percent
  • Management of cloud services -- 33 percent
  • Scalability -- 33 percent

On pricing, the report stated: "While cost savings may be a key benefit of cloud, the current pricing models under which they operate are difficult to understand. A cursory glance at the pricing schemes of major public cloud providers demonstrates the need for the customer to carefully analyze pricing models and their own IT needs before committing to an option."

On performance, the report stated: "Different clouds are architected with different back-ends, and some are more susceptible to 'noisy neighbor' syndrome than others. For customers who are sensitive to variations in performance, this can impact the cloud experience."

Moving from challenges to solutions, respondents were asked what factors would make cloud services more accessible to their organizations. The responses were:

  • Better management dashboard -- 52 percent
  • More flexible virtual machine (VM) scaling -- 47 percent
  • Easier resource scalability -- 48 percent
  • Better VMware vSphere integration -- 45 percent
  • More transparent pricing -- 43 percent
  • Simpler onboarding -- 37 percent
  • Certainty of geographic location of workloads -- 35 percent

"Finally, respondents were nearly unanimous in their agreement that high-quality, highly available phone-based support was critical to their cloud implementations," the study said. "This is notable, in part, because phone-based support is far from standard among public cloud companies, and more often than not, is accompanied by a high-cost support contract."

Despite the problems, the study revealed that almost 60 percent of respondents indicated an interest in adding cloud vendors. In contrast, fewer than 20 percent plan to quit the cloud because of security, cost, compliance or complexity issues.

Top factors influencing cloud adoption were listed as disaster recovery capabilities, cost, rapid scalability and deployment speed.

"Companies cannot afford to turn their backs on cloud computing, as it represents a key tool in the race for innovation across industries and around the world," the report stated in its summary.

Posted by David Ramel on 06/25/2014 at 12:49 PM


Some Cheerful Truths About Cloud Storage

Yes, Jon William Toigo makes some excellent points about the lack of cloud storage management tools in his recent post, "Some Depressing but Hard Truths About Cloud Storage," but there has been some good news lately: dropping cloud storage prices.

Sure, the ongoing cloud storage price wars among companies such as Google Inc., Apple Inc. and Microsoft might be consumer-oriented, but cheaper cloud storage for the public at large surely translates into lower costs for enterprises.

The latest volley came this week from Microsoft, which increased the amount of free storage on its OneDrive (formerly SkyDrive) service and slashed prices for paid storage. The move follows similar initiatives from competitors.

In March, Google cut prices for its Google Drive service.

Shortly after, Amazon also got in on the act, posting lower prices effective in April.

Enterprise-oriented Box at about the same time decreased the cost of some of its services, such as its Content API.

Earlier this month, Apple announced price reductions for its iCloud storage options and previewed a new iCloud Drive service coming later this year with OS upgrades.

Just a couple of weeks ago, IBM cut the price of object storage in its SoftLayer cloud platform.

Interestingly, "Dropbox refuses to follow Amazon and Google by dropping prices," CloudPro reported recently. Coincidentally, The Motley Fool today opined that "Microsoft May Have Just Killed Dropbox."

In addition to price wars, the cloud storage rivals are also improving other aspects of their services in order to remain competitive. For example, in the Microsoft announcement a couple days ago, the company also increased the free storage limit from 7GB to 15GB. And if you subscribe to Office 365, you get a whopping 1TB of free storage. In paid storage, the new monthly prices will be $1.99 for 100GB (previously $7.49) and $3.99 for 200GB (previously $11.49), the company said.
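
A little arithmetic on those quoted prices shows just how steep the cuts are (an illustration only, using nothing but the figures above):

    # Per-GB monthly cost before and after the OneDrive cuts, using only the
    # figures quoted above.
    plans = {"100GB": (7.49, 1.99), "200GB": (11.49, 3.99)}  # (old, new) $/month
    for name, (old, new) in plans.items():
        gb = int(name[:-2])
        cut = (old - new) / old * 100
        print(f"{name}: ${old/gb:.3f} -> ${new/gb:.3f} per GB/month ({cut:.0f}% cheaper)")
    # 100GB: $0.075 -> $0.020 per GB/month (73% cheaper)
    # 200GB: $0.057 -> $0.020 per GB/month (65% cheaper)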

And, of course, Amazon and Google both last week announced dueling solid-state drive (SSD) offerings.

So as Jon William Toigo ably explained, enterprise cloud storage management is a headache and it's not getting enough attention. But if these consumer trends continue apace in the enterprise arena, lower TCO and increased ROI should lessen the pain.

Posted by David Ramel on 06/25/2014 at 12:59 PM


SSD Cloud Storage Latest Tool in Amazon vs. Google

Amid a battle for cloud supremacy involving alternating price cuts and feature introductions, both Amazon and Google introduced solid-state drive (SSD) storage options for their services this week.

Amazon Web Services (AWS) yesterday announced an SSD-backed storage option for its Amazon Elastic Block Store (Amazon EBS), just one day after Google announced a new SSD persistent disk product for its Google Cloud Platform.

The moves are just the latest in a continuing struggle, as illustrated by research released earlier this month by analyst firm Gartner Inc. showing perennial cloud leader Amazon was being chased by Google and Microsoft in the Infrastructure as a Service (IaaS) market.

In fact, this week's alternating SSD announcements mirrored a surprisingly similar scenario in March, when Google lowered its cloud service prices and AWS followed up the very next day with its own price reduction. (Here's a price comparison conducted by RightScale after the moves, if you're interested.)

One difference in this week's sparring is that Google positioned its SSD product as a high-performance option, while the AWS offering targets general-purpose use, as it already had an existing high-performance service.

Google said its persistent disk product was a response to customers asking for a high input/output operations per second (IOPS) solution for specific use cases. The SSD persistent disks are now in a limited preview (you can apply here) with a default 1TB quota. SSD pricing is $0.325 per gigabyte per month, while standard persistent disk storage (the old-fashioned kind with spinning platters) costs $0.04 per gigabyte per month.
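
Applying simple arithmetic to those rates -- for a hypothetical 1TB volume, capacity charges only -- shows the size of the premium, and why the guidance quoted below steers SSD toward IOPS-bound workloads:

    # Monthly capacity cost of a 1TB (1,000GB) volume at the quoted rates.
    # Illustrative arithmetic only; real bills depend on usage and region.
    gb = 1000
    ssd_rate, std_rate = 0.325, 0.04  # $ per GB per month, as quoted
    print(f"SSD persistent disk:      ${ssd_rate * gb:,.2f}/month")
    print(f"Standard persistent disk: ${std_rate * gb:,.2f}/month")
    # SSD persistent disk:      $325.00/month
    # Standard persistent disk: $40.00/month  (roughly an 8x per-GB premium)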

"Compared to standard persistent disks, SSD persistent disks are more expensive per GB and per MB/s of throughput, but are far less expensive per IOPS," Google said. "So, it is best to use standard persistent disks where the limiting factor is space or streaming throughput, and it is best to use SSD persistent disks where the limiting factor is random IOPS." Included in the Google announcement was news of another new product: HTTP load balancing, also in limited preview.

In the AWS announcement the next day, the company introduced a General Purpose (SSD) volume type to join its existing higher-performance Provisioned IOPS (SSD) service and the lower-grade Magnetic (formerly called Standard) volumes. The storage volumes can be attached to the company's Elastic Compute Cloud (EC2) instances.

"The General Purpose (SSD) volumes introduced today are designed to support the vast majority of persistent storage workloads and are the new default Amazon EBS volume," AWS said. "Provisioned IOPS (SSD) volumes are designed for I/O-intensive applications such as large relational or NoSQL databases where performance consistency and low latency are critical."

AWS said the new offering was designed for "five nines" availability and can burst up to 3,000 IOPS, targeting a variety of workloads including personal productivity, small or midsize databases, test/development environments, and boot volumes.

The new AWS EBS product costs $0.10 per gigabyte per month at its Virginia and Oregon datacenters, while the higher-performance Provisioned IOPS (SSD) service costs $0.125 per gigabyte per month. Note that the companies' pricing schemes vary in some details, so listed prices here aren't necessarily directly comparable.

I don't know how AWS has been able to counter its challenger's announcements within 24 hours of release, but I can't wait to see what's next.

Posted by David Ramel on 06/18/2014 at 12:19 PM


Cloud Survey: Amazon AWS, VMware Lead in Public, Private Arenas

New research suggests Amazon AWS is the platform of choice for public clouds while VMware is being chased by open source contender OpenStack in the private cloud arena.

Commissioned by Database-as-a-Service (DBaaS) start-up Tesora, the "Database Usage in the Public and Private Cloud: Choices and Preferences" survey garnered more than 500 responses from North American open source developer communities.

"OpenStack is catching up to VMware as the preferred private cloud platform even though it has only been around for a few years," Tesora said in an accompanying statement today. "Of organizations that are using a private cloud, more than one-third now use OpenStack."

Tesora says OpenStack deployments are gaining on VMware (source: Tesora)

The database service being developed by Tesora is based on the open source Trove DBaaS project introduced in the April "Icehouse" release of the OpenStack cloud platform. Tesora -- formerly called ParElastic -- launched in February and last week open sourced its database virtualization engine (DVE). It now offers a free community edition and a supported enterprise edition of DVE. The company says it's also developing what it claims is the first enterprise-class, scalable DBaaS platform for OpenStack.

In the Tesora-commissioned survey, 15 percent of respondents reported that VMware vCloud Director was being used as a private cloud infrastructure at their organizations, followed by OpenStack at 11 percent, CloudStack at 3 percent and Eucalyptus at 1 percent. Tesora said that of all organizations using a private cloud, more than one-third use OpenStack.

A report caveat noted: "This is a survey of open-source software developers, and surveys of other groups will have different results. For example, a report by 451 Research, 'The OpenStack Tipping Point -- Will It Go Over the Edge?' (May 2014) shows a somewhat wider gap between VMWare and OpenStack, while a 2013 IDG Connect survey (funded by Red Hat) found that fully 84 percent of IT decision-makers plan on implementing OpenStack at some point."

Tesora noted that OpenStack's most recent survey of its own members, taken last month at an Atlanta conference, found more than 500 OpenStack clouds in production.

451 Research shows a larger VMware lead over OpenStack (source: 451 Research)
IDG Connect report shows wider planned OpenStack adoption (source: Red Hat)
OpenStack's user survey deployment chart (source: OpenStack)

In the public cloud arena, Amazon AWS was reported as the infrastructure being used the most, garnering 24 percent of responses, followed by Google Compute Engine (GCE) at 16 percent, Microsoft Azure at 8 percent, Rackspace at 6 percent and IBM Softlayer at 4 percent. Tesora said that while Amazon AWS coming in at No. 1 was not surprising, "what is a surprise is that Google GCE was fairly close behind at 16 percent even though it has only been generally available since December 2013."

Tesora said another expected result in the survey findings was that Microsoft SQL Server was the No. 1 database in use for public and private clouds, mentioned by 57 percent of respondents. MySQL was second at 40 percent, followed by Oracle, 38 percent, and MongoDB (the most popular NoSQL choice), at 10 percent.

For both public and private clouds, Web services was the most-reported workload, followed by quality assurance and databases. Cost savings was the primary reason given for implementing a private cloud, followed closely by operational efficiencies and integration with existing systems.

"While more than half of all respondents indicated that they are likely or very likely to use their company’s DBaaS on a private cloud, 31 percent plan on implementing or have implemented DBaaS in a private cloud," noted Tesora in reporting on further survey findings. "Of the people who are interested in DBaaS, nearly all of them were looking for relational DBaaS. Roughly a third (34 percent) of those people were interested in NoSQL DBaaS."

The survey report also made the following statements:

  • The results suggest that relational databases still dominate despite rapid adoption of NoSQL solutions by high-profile enterprises like Twitter and Facebook.
  • Only 5 percent of respondents were doing any kind of Big Data/data mining/Hadoop workloads, suggesting that the marketing hype around 'Big Data' may be ahead of reality.
  • The results also indicate how mainstream cloud computing has become, with more than half of respondents reporting that their organizations are using public clouds, while 49 percent are using private clouds.

Tesora noted that respondents were early adopters of new technology and the survey provided a snapshot of OpenStack and DBaaS in early stages of development. "It's encouraging to see the traction of OpenStack in this early adopter segment of the private cloud market," said Ken Rugg, CEO of Tesora. "These findings are important because they are leading indicators of the kinds of technology and architecture decisions we can expect to see as private cloud adoption explodes and OpenStack matures and more vendors and customers go down this open source path."

Tesora used SurveyMonkey to conduct the survey (available for download after registration), while The Linux Foundation, MongoDB and Percona helped distribute it.

Posted by David Ramel on 06/18/2014 at 8:36 AM


HP Piles On Cisco in New SDN Announcements

Amid vendor squabbling among companies such as VMware and Cisco about competing next-generation data center technologies, HP was up for a little Cisco-bashing of its own when it announced new cloud computing and software-defined networking (SDN) products at this week's HP Discover conference in Las Vegas.

Featured in the many new offerings were: the HP Virtual Cloud Networking SDN Application, an open standards-based network virtualization solution; the HP Helion Self-Service HPC, a private cloud product running on HP's OpenStack-based cloud platform; and the HP FlexFabric 7900 switch series, which supports the OpenFlow standard managed by the Open Networking Foundation (ONF) and the Virtual Extensible LAN (VXLAN) specification originally created by VMware, Arista Networks and Cisco.

In describing various new products, HP exec Kash Shaikh led off a blog post with a reference to Gartner analyst Mark Fabbi's recent disparaging remarks about Cisco's datacenter networking strategy, wherein he said "a reactive vendor isn't a leader."

"We couldn't agree more," said Shaikh, senior director of product and technical marketing. "We announced our complete SDN strategy almost two years back and we have been leading and consistently delivering on this strategy to take you on a journey that doesn’t require forklift upgrades."

HP's technology comparison (source: Hewlett-Packard Co.)

In case there was any doubt about Shaikh's target, he included a product comparison chart that indicates Cisco's solution requires: "fork-lift upgrade, additional hardware cost ACI/Nexus 9000." He also noted HP "doesn't require specific switch hardware for its cloud-based SDN solution."

Networking powerhouse Cisco has been criticized for an inconsistent approach to the advent of SDN technology, eventually developing its own flavor of the same -- Application Centric Infrastructure (ACI) -- with proprietary components such as Nexus 9000 switches.

Of course, Cisco has been a willing participant in the back-and-forth, with CEO John Chambers recently stating the networking industry was in for a "brutal consolidation" that will see competitors such as HP fail and vowing the company was going to win back customers and "crush" competitors such as VMware.

Anyway, for those interested in speeds and feeds more than snarky sniping, HP's announcements are summarized here:

  • The HP Virtual Cloud Networking SDN application integrates with the HP Virtual Application Networks SDN controller and uses OpenFlow for dynamic deployment of policies on virtual networks such as those using the Open vSwitch implementation and physical networks from HP and others, HP said.

    "VCN [Virtual Cloud Networking] provides a multitenant network virtualization service for KVM and VMware ESX multi-hypervisor datacenter applications, offering organizations both open source as well as proprietary solutions," the company said. "Multitenant isolation is provided by centrally orchestrated VLAN or VXLAN-based virtual networks, operating over standard L2 or L3 datacenter fabrics."

    The SDN application will come out in August with the company's HP Helion OpenStack cloud platform release.

  • The HP Helion Self-Service HPC is designed to make high-performance computing (HPC) resources easier to use through an optimized private cloud.

    "This new solution provides a self-service portal that makes using HPC resources as easy as using a familiar application -- thereby making them accessible for more staff," the company said. "The solution also makes HPC accessible and manageable for organizations by allowing them the choice to manage it themselves or have HP HPC experts implement and manage it for them through a pay-for-use model that lets them stay focused on delivering products and innovation."

  • The HP FlexFabric 7900 switch series, available now with a $55,500 price tag, is described as a compact modular switch targeting virtualized datacenters that supports VXLAN, Network Virtualization using Generic Routing Encapsulation (NVGRE) and OpenFlow. HP said the switch supports open, standards-based programmability via its SDN App Store and SDK. The switch federates HP FlexFabric infrastructure via a VMware NSX virtual overlay. It provides 10GbE, 40GbE and 100GbE interfaces.

HP also made a number of other product announcements, including: new HP Apollo HPC systems; enhanced all-flash HP 3PAR StoreServ and HP StoreOnce Backup applications; an enhanced HP ConvergedSystem for Virtualization platform for IT as a Service; an HP Trusted Network Transformation service to help customers with on-the-go network upgrades; and an HP Datacenter Care Flexible Capacity pay-as-you-go model.

The company said the new offerings "enable a software-defined datacenter that is supported by cloud delivery models and built on a converged infrastructure that spans compute, storage and network technologies."

Shaikh was more pointed in his summary of the market, including one more shot at Cisco. "While some niche vendors only address certain aspects of datacenter and cloud, others are taxing customers with a series of disjointed, proprietary and expensive hardware-defined fork-lift upgrades," he said. "Meanwhile, HP has been leading in software-defined networking since its beginning in 2007 and has the most comprehensive SDN offering."

Posted by David Ramel on 06/11/2014 at 11:41 AM


New Red Hat Enterprise Linux Release Targets the Cloud

Red Hat's brand-new, major Linux OS enterprise release is all about the cloud.

The company is positioning Red Hat Enterprise Linux 7 (RHEL 7) as going beyond an old-fashioned commodity OS to be instead a next-generation "catalyst for enterprise innovation." It's a redefined tool to help companies transition to a converged datacenter encompassing bare-metal servers, X as a Service offerings and virtual machines (VMs) -- with a focus on open hybrid cloud implementations.

In case there's any doubt about the cloud emphasis, Red Hat used the term 23 times in its announcement yesterday.

"Answering the heterogeneous realities of modern enterprise IT, RHEL 7 offers a cohesive, unified foundation that enables customers to balance modern demands while reaping the benefits of computing innovation, like Linux Containers and Big Data, across physical systems, virtual machines and the cloud -- the open hybrid cloud," the company said.

The New OS (source: Red Hat Inc.)

New features in the OS -- practically a ubiquitous standard in Fortune 500 datacenters -- include "enhanced application development, delivery, portability and isolation through Linux Containers, including Docker, across physical, virtual, and cloud deployments, as well as development, test, and production environments," the company said.

XFS is the new default file system, with the capability to scale volumes to 500TB. Ext4, the previous default -- now an option -- had been limited to 16TB file systems, a ceiling the new release raises to 50TB.

Also included are tools for application runtimes and development, delivery and troubleshooting.

The new release is designed to coexist in heterogeneous environments, as exemplified by cross-realm trust that allows secure access from users on Microsoft Active Directory across Windows and Linux domains.

Other features as listed by Red Hat include:

  • A new boot loader and redesigned graphical installer.
  • The kernel patching utility Technology Preview, which allows users to patch the kernel without rebooting.
  • Innovative infrastructure components such as systemd, a standard for modernizing the management of processes, services, security and other resources (a minimal unit file sketch follows this list).
  • The Docker environment, which allows users to deploy an application as a lightweight and portable container.
  • Built-in performance profiles, tuning and instrumentation for optimized performance and scalability.
  • The OpenLMI project, a common infrastructure for the management of Linux systems.
  • Red Hat Software Collections, which provides a set of dynamic programming languages, database servers and related software.
  • Enhanced application isolation and security enacted through containerization to protect against both unintentional interference and malicious attacks.
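
For readers who haven't met systemd yet, services are described in small declarative unit files rather than init scripts. The fragment below is a generic, made-up example; the service name and paths are hypothetical:

    # /etc/systemd/system/myapp.service -- a minimal, hypothetical unit file.
    # systemd starts, supervises and (here) restarts the declared process.
    [Unit]
    Description=Example application service
    After=network.target

    [Service]
    ExecStart=/usr/local/bin/myapp --config /etc/myapp.conf
    Restart=on-failure

    [Install]
    WantedBy=multi-user.target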

A new RHEL edition to be released later -- Atomic Host -- was announced during a virtual keynote presentation (registration required to view on-demand) by Red Hat executive Jim Totton. "It's a lightweight version of the same Red Hat Enterprise Linux operating system, tuned to run in container environments," he said. "So this is a lightweight version of RHEL, but it's the same RHEL as we've been talking about."

Totton also announced Red Hat Enterprise Virtualization product technology, in which a hypervisor is engineered as part of the RHEL kernel. He said it can provide the virtual datacenter fabric for hosting RHEL as a guest. Totton also highlighted the Red Hat Enterprise Linux OpenStack platform "for really deploying the flexible and agile private cloud and public cloud type of environments."

RHEL 7 got a thumbs-up from Forrester Research analyst Richard Fichera, who wrote that it "continues the progress of the Linux community toward an OS that is fully capable of replacing proprietary RISC/Unix for the vast majority of enterprise workloads. It is apparent, both from the details on RHEL 7 and from perusing the documentation on other distribution providers, that Linux has continued to mature nicely as both a foundation for large scale-out clouds, as well as a strong contender for the kind of enterprise workloads that previously were only comfortable on either RISC/Unix systems or large Microsoft Server systems. In effect, Linux has continued its maturation to the point where its feature set and scalability begin to look like and feel like a top-tier Unix."

Oh, it also works in that cloud thing.

Posted by David Ramel on 06/11/2014 at 11:10 AM


Study: Cloud Increases Potentially Costly Security Risks

A new report shows IT pros believe using cloud services increases the risk of data breaches that -- with an estimated cost of about $201 per compromised record -- can easily cost victimized companies millions of dollars.

Furthermore, IT and security staffers believe they have little knowledge of the scope of cloud services in use at their companies and are unsure of who is responsible for securing these services, according to a study, "Data Breach: The Cloud Multiplier Effect," conducted by Ponemon Institute LLC for Netskope, a cloud application analytics and policy enforcement company.

Netskope yesterday released what it describes as the first-ever research conducted to estimate the cost of a data breach in the cloud, gathering information from a survey of 613 IT and security professionals. They were asked to estimate the probability of a data breach involving 100,000 or more customer records at their organizations under current circumstances, and how increased use of cloud services would affect that probability. The conclusion, Netskope said, is that increased cloud use could triple the risk of such a data breach.

IT and security pros show little trust in their organizations' security practices. (source: Netskope)

And that breach could be quite expensive for a company, according to the study, which extrapolated the potential cost by using data from a previous Ponemon Institute report released last month, "Cost of a Data Breach." It pegged the cost of each lost or stolen customer record at about $201, meaning a 100,000-customer breach could cost more than $20 million.

"Imagine then if the probability of that data breach were to triple simply because you increased your use of the cloud," said Sanjay Beri, CEO and founder of Netskope. "That's what enterprise IT folks are coming to grips with, and they've started to recognize the need to align their security programs to account for it. The report shows that while there are many enterprise-ready apps available today, the uncertainty from risky apps is stealing the show for IT and security professionals. Rewriting this story requires contextual knowledge about how these apps are being used and an effective way of mitigating risk." That, of course, is what Netskope seeks to sell to customers.

Several factors contribute to the general perception that an organization's high-value intellectual property and customer data are at higher risk in today's typical corporate environment when the use of cloud services increases. For example, respondents said their networks are running cloud services unknown to them, they aren't familiar with cloud service provider security practices and they believe their companies don't pay enough attention to the implementation and monitoring of security programs.

"Cloud security is an oxymoron for many companies," the report stated. "Sixty-two percent of respondents do not agree or are unsure that cloud services are thoroughly vetted before deployment. Sixty-nine percent believe there is a failure to be proactive in assessing information that is too sensitive to be stored in the cloud."

Also worrisome is the percentage of business-critical applications entrusted to the cloud. Survey respondents estimated that about 36 percent of such apps are cloud-based, yet nearly half of them are invisible to IT. Respondents also said 45 percent of all applications are based in the cloud, but IT lacks visibility into half of them.

Similar cloud-related security research conducted earlier by Ponemon Institute indicated that 53 percent of organizations were trusting cloud providers with sensitive data. Only 11 percent of respondents in that survey said they didn't plan on using cloud services in the next couple of years.

According to yesterday's report, respondents also don't seem to trust their cloud service providers much. "Almost three-quarters (72 percent) of respondents believe their cloud service provider would not notify them immediately if they had a data breach involving the loss or theft of their intellectual property or business confidential information, and 71 percent believe they would not receive immediate notification following a breach involving the loss or theft of customer data," the report stated.

Netskope listed the following highlights of the study:

  • Respondents estimate that every 1 percent increase in the use of cloud services will result in a 3 percent higher probability of a data breach. By that math, an organization using 100 cloud services would only need to add 25 more to increase the likelihood of a data breach by 75 percent (see the worked arithmetic after this list).
  • More than two-thirds (69 percent) of respondents believe that their organization is not proactive in assessing information that is too sensitive to be stored in the cloud.
  • 62 percent of respondents believe the cloud services in use by their organization are not thoroughly vetted for security before deployment.
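
Working through the report's own figures -- a back-of-the-envelope illustration, not part of the study itself -- the arithmetic looks like this:

    # Back-of-the-envelope math using the figures quoted in this post.
    cost_per_record = 201      # Ponemon's estimated cost per lost record, in $
    records = 100_000          # breach size used in the survey question
    print(f"Baseline breach cost: ${cost_per_record * records:,}")  # $20,100,000

    # The claimed multiplier: every 1% increase in cloud use adds 3% to the
    # probability of a breach. Adding 25 services to a base of 100 is a 25%
    # increase in usage, hence a 75% jump in breach likelihood.
    base_services, added = 100, 25
    usage_increase_pct = added / base_services * 100   # 25.0
    risk_increase_pct = usage_increase_pct * 3         # 75.0
    print(f"Estimated increase in breach probability: {risk_increase_pct:.0f}%")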

"We've been tracking the cost of a data breach for years but have never had the opportunity to look at the potential risks and economic impact that might come from cloud in particular," said Dr. Larry Ponemon, chairman and founder of Ponemon Institute. "It's fascinating that the perceived risk and economic impact is so high when it comes to cloud app usage. We'll be interested to see how these perceptions change over time as the challenge becomes more openly discussed and cloud access and security broker solutions like Netskope become more known to enterprises."

The 26-page report is available in a PDF download after free registration.

Posted by David Ramel on 06/05/2014 at 8:58 AM


Gartner: Cloud Leader Amazon Chased by Microsoft and Google

For the fifth straight year, Amazon Web Services (AWS) was ranked as the top cloud Infrastructure as a Service (IaaS) vendor in the "magic quadrant" report released by research firm Gartner Inc.

However, it's being chased by Microsoft -- new to the party but coming on strong -- and Google, which made the report for the first time this year.

Gartner's magic quadrant report uses exhaustive research to place vendors in a four-square chart with axes based on "ability to execute" and "completeness of vision." The top-right square contains "leaders" based on cumulative scores. The bottom-left vendors are at the opposite end, "niche players." Top-left is "challengers" and bottom-right is "visionaries." Figure 1 shows the results of the May 2014 report.

Figure 1. Gartner 2014 Magic Quadrant IaaS report (source: Gartner Inc.)

AWS was again at the top in the leaders square, joined by Microsoft. Last year (in an August report), CSC was the only other company in the leaders square. This year, it changed places with Microsoft, which moved up from the visionaries quadrant. Google debuted on the chart in the No. 3 slot on the vision axis, though it was ranked eighth on the execution axis. Although Google has been offering its App Engine service since 2008, Gartner said it didn't enter the IaaS arena until the December 2013 launch of its Google Compute Engine.

Likewise, Microsoft has been in the cloud business for some time with its Azure service, but the company was considered by Gartner to be in the Platform as a Service (PaaS) market until the April 2013 debut of Azure Infrastructure Services.

VMware was the second vendor to make the chart for the first time in the latest report. The only companies dropped from the chart were removed because of acquisitions: SoftLayer (now part of IBM), and Savvis and Tier 3 (both now part of CenturyLink).

The continued dominance of AWS was evident in Gartner's assessment of its strengths. Enjoying a diverse customer base and wide range of use cases, Gartner said AWS "is the overwhelming market share leader, with more than five times the cloud IaaS compute capacity in use than the aggregate total of the other 14 providers in this Magic Quadrant. It is a thought leader; it is extraordinarily innovative, exceptionally agile, and very responsive to the market. It has the richest array of IaaS features and PaaS-like capabilities, and continues to rapidly expand its service offerings. It is the provider most commonly chosen for strategic adoption."

However, the research firm cautioned that the market is still maturing and evolving, and things may not stay the same.

"AWS is beginning to face significant competition -- from Microsoft in the traditional business market, and from Google in the cloud-native market," Gartner said. "So far, it has responded aggressively to price drops by competitors on commodity resources. However, although it is continuously reducing its prices, it does not commodity price services where it has superior capabilities. AWS currently has a multiyear competitive advantage, but is no longer the only fast-moving, innovative, global-class provider in the market."

Here are the quadrant charts from previous years:
  • August 2013 (source: Gartner Inc.)
  • October 2012 (source: Gartner Inc.)
  • December 2011 (source: Gartner Inc.)
  • December 2010 (source: Gartner Inc.)

Posted by David Ramel on 06/04/2014 at 6:59 PM

