Storage: Can't live with it, can't live without it. DataCore Software, which knows a lot about storage, is trying to make it less obtrusive and expensive to live with in virtualized environments via SANsymphony-V, which works on existing data center equipment and storage devices to eliminate sluggish I/O bottlenecks that degrade the virtualization user experience.
Designed for mid-market end users and solution providers familiar with Windows Server administration, SANsymphony-V decouples virtual infrastructures from their underlying disks, using adaptive caching and other performance-boosting techniques to optimize the I/O responses of standard storage devices that might otherwise have been replaced.
In this open environment, users can make their best deals among competitive suppliers when they need to expand storage capacity.
According to DataCore president and CEO George Teixeira, "DataCore doesn't have to rip and replace your storage infrastructure, so you save on capex and reduce your TCO over the long term." He goes on to note that the product also cuts operational costs by instituting best practices via automation and guided workflows.
In addition to accelerating physical-to-virtual migrations by putting the disk drives and data mounted on the original systems to full use, SANsymphony-V offers a device-independent feature set that includes disk pooling, synchronous mirroring, asynchronous remote replication, non-disruptive disk migrations, and low-impact, online snapshots. Integrated continuous data protection (CDP), multi-site recovery and traffic-compressing replication help protect workloads and avoid disruptions.
Software licenses for a fully redundant, high-availability configuration start under $10,000 USD, including annual 24x7 technical support.
DataCore also released the results of a benchmark test describing the performance of a configuration of 220 virtual desktops running on low-cost servers that cuts hardware costs to approximately $32.41 per desktop, including the storage infrastructure. The company says the low dollar figure is achieved using a configuration with dual mode, cross-mirrored high availability storage.
"This configuration uses the VSI benchmark and is based on DataCore's SANmelody software and the Microsoft Hyper-V virtualization platform. DataCore expects similar results with ESX and will publish those results when they become available," the company said.
Posted by Bruce Hoard on 01/31/2011 at 12:48 PM
In the wake of today's announcement by Citrix that it will provide engineering support to Amazon that will optimize Citrix products and Windows apps running on Amazon Web Services (AWS), the always loquacious and opinionated Citrix CTO Simon Crosby put his own spin on developments of the day, noting that AWS is building PaaS the right way (not that Microsoft isn't), VMware is out to lunch for calling AWS a "consumer cloud" (a gratuitous potshot), and Citrix customers can expect "seamless manageability for private and hosted workloads, with role-based, end-to-end management from any enterprise virtualization platform, to the cloud." (Not that Citrix would ever give them anything but the best.)
You know if Simon says this announcement is "perhaps predictably short on detail," as he did, that you're only getting the French pastry, as opposed to the boeuf bourguignon. Undeterred, he goes on to compliment AWS for adding high quality software services such as the Elastic Beanstalk and the RDS Relational Database Service for no incremental charge beyond compute and storage.
Elaborating on his PaaS kudos, Simon says AWS is hitting the mark by offering "highly sticky services that power real world applications," and Citrix looks forward to working with AWS in the future because the company is obviously serious about the enterprise cloud market.
He goes on to note, somewhat vaguely, that the two firms will jointly work toward furthering Citrix's strategic goal of delivering "open, interoperable" cloud solutions via collaboration on Windows, interoperability and the development of value-added cloud solutions.
As for Microsoft--which has a convivial customer relationship with AWS--Simon was nothing less than delighted to explain how this deal is good for Citrix's longstanding ally. As he puts it, "The Azure VM role thus far offers only an ephemeral instance model for Windows Server 2008 R2 VMs. Our collaboration with Amazon will enable Microsoft's Enterprise customers--and SPLA consumers--to confidently select AWS for the more demanding non-ephemeral workloads."
Win, win, and win.
Posted by Bruce Hoard on 01/27/2011 at 12:48 PM
Smart companies like Capgemini and VMware realize that there are times when it makes good business sense to combine efforts with other companies that offer services that are complementary to their own. So Capgemini--which specializes in technology, consulting, and outsourcing--is hooking up with virtualization major domo VMware to form a high-profile purveyor of business-centric, virtualization-based services available on a pay-as-you-go, IT-as-a-Service model foundation.
Capgemini and VMware say their new offerings are part of a "flexible, consumption-based" model that aims at cutting costs via consolidation and server virtualization. Naturally, the cloud is a major component of this equation, and the two partners are eager to project the growing number of cloud-enabling virtualized data centers that are increasingly dotting the IT landscape, as well as VMware's key role in that transition.
According to the joint press release issued by the two firms, virtualization is on the uptick as many companies "overcome the challenges of investment and adopt holistic approaches to transformation, helping them to break through the 30% barrier," which we can only assume is the percentage of servers that have been virtualized to date. At any rate, that sure is a lot of fancy talk. I think it has something to do with people spending a whole bunch of money on virtualization-as-a-cash cow (VaaCC).
If you ask me, it all sounds a little squishy. For example, Capgemini claims they are prepared to offer "measurable business results" because of their time-tested skills in managing IT infrastructures, which have enabled their clients to "transform" the delivery of their IT services. Beyond this high-flying rhetoric, there are no further detailed descriptions of savings that customers can expect to experience.
It will be interesting to see what's going on with this alliance a year from now.
Any predictions?
Posted by Bruce Hoard on 01/24/2011 at 12:48 PM
Got a heads-up from a vendor today: 2011 will be the year of virtual stall (which rhymes with VM sprawl, but is not a close relative). According to this vendor--who obviously has skin in the VM stall game--"analysts, visionaries, and vendors; they're all predicting that this is the year we see virtual stall taking a chunk out of the virtualization market."
I'm going to disagree with that based on the results of my many questions to vendors, users and other experts during innumerable interviews and briefings I have had in the past year or so. "Surely," I have said, "the server virtualization market is going to slow down soon because everyone agrees that while only 50 percent of likely servers have been virtualized, the low-hanging fruit has been picked over, and the failure of management technology to keep pace with the proliferation of virtualized data centers is starting to catch up with beleaguered users."
“Not so,” nearly everyone has replied, before going into the sound reasons why this great--if you're VMware--market remains largely untapped. So without a bunch of reasons why, we just know that VMware, which basically owns the server virtualization game, will continue to prosper. Nobody is going to take any chunks out of their bread and butter.
So what about Citrix? If you listen to their public pronouncements, XenServer downloads are strong and the market is receptive. Microsoft? Hyper-V may be the poor man's virtualization tool of choice, and Redmond won't be capturing dominant virtualization market share anytime soon, but there are many users out there who are happy to stay in their comfort zone with what they know best, even if Hyper-V is relatively underpowered compared to ESXi.
I haven't heard of any user groundswell indicating that server virtualization buyers are beginning to balk; just the opposite, really, because more and more SMBs are virtualizing, and they tend to do that with big buys. At the same time, enterprise data centers have come a long way, but again you don't hear CIOs saying they are reaching the end of the line with their server virtualization plans. What they're seeing is a lot of upcoming cloud configurations that will require substantial, data center-based virtual server support.
This market is just too big, too lucrative to get sick at this point. Virtual stall? I don't think so.
What do you think?
Posted by Bruce Hoard on 01/20/2011 at 12:48 PM
The long-standing, conventional wisdom has always been that when it comes to the IT marketplace, no one company can do it all. While this will probably always be true on some level, you have to give Embotics credit for taking a pretty full-featured shot at the do-it-all mantle with their newly unveiled V-Commander 3.7, which the company ambitiously claims provides end-to-end capabilities for managing virtual infrastructures and private clouds.
When I recently spoke with Jason Cowie, Embotics VP of Product Management, he enumerated a long list of V-Commander 3.7 capabilities, including embedded best practices via a self-service portal that standardizes VM requests; integration with vSphere and vCenter via a vCenter plug-in that enables real-time, system-wide management capabilities from the proverbial single pane of glass; and the ability to set expiration dates for VMs, which are never left alone to cause problems during their short stay here on Planet Earth. When their time is up, ZAP!, they are decommissioned and dismissed without prejudice.
Of course, these VMs can never be created in the first place if they fail to pass muster in terms of policy-based compliance requirements. Once they are up and running, V-Commander protects them like a mother lion guarding her cubs, ensuring, via the system's capacity management feature, that all virtual assets are evenly balanced and provisioned. If they become even a little off-kilter, a fine-tuning capability nudges them back to center.
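Embotics hasn't published its internals, but the expiration mechanic described above can be sketched in a few lines of Python. The `VM` class and `sweep_expired` helper here are hypothetical, purely for illustration of the policy:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class VM:
    name: str
    expires: date          # expiration date set when the VM is approved
    decommissioned: bool = False

def sweep_expired(vms: list[VM], today: date) -> list[str]:
    """Decommission every VM whose expiration date has passed,
    returning the names of the VMs that were retired."""
    retired = []
    for vm in vms:
        if not vm.decommissioned and vm.expires < today:
            vm.decommissioned = True   # ZAP! -- dismissed without prejudice
            retired.append(vm.name)
    return retired
```

A periodic sweep like this is what keeps short-lived VMs from lingering past their welcome.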
Loath though I be to reproduce or give comfort to canned analyst quotes, when the canned analyst is the redoubtable Gartner Research VP, Chris Wolf, I stand down. In his rather lengthy statement, he endorses Embotics' goals with V-Commander, and notes the importance to users of solutions that intelligently run the gamut of lifecycle requirements from provisioning to retirement. Chris goes on to declare, "Integrated marketing stories that represent numerous disjointed products no longer cut it. Our customers are demanding tightly integrated automation stacks to move forward with their cloud plans."
V-Commander--which is complementary to the VMware platform--costs $299 per host CPU socket.
Posted by Bruce Hoard on 01/18/2011 at 12:48 PM
Although the big acquisition buzz yesterday was about NetApp looking to widen its value to virtual infrastructures via its purchase of Akorri, I was more intrigued by Cisco's announcement that it had completed its acquisition of LineSider Technologies, a low-profile company founded in 2005 that got a head start on the cloud craze with its OverDrive product, which automates network services so the network becomes "fluid and responsive" to user demands.
More specifically, LineSider says OverDrive streamlines cloud environments and reduces network operation costs by ensuring that applications receive the network resources they require any time by provisioning them without costly, invasive, manual intervention. The company employs a policy-based approach which "automatically responds to pre-determined business scenarios and provides maximum performance across networking products and service delivery platforms." This approach eliminates the need to burn time and money every time an application, computer or storage resource changes in virtualized data centers, which is frequently, since they are typically characterized by dynamic movement between physical locations or network segments.
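LineSider's policy-based approach can be pictured as a lookup table mapping pre-determined business scenarios to network actions, so that a resource change triggers automatic reconfiguration instead of manual intervention. This sketch is my own illustration in that spirit, not OverDrive's actual design; the event names and actions are invented:

```python
# Map pre-determined business scenarios to the network actions they trigger.
POLICIES = {
    "vm_migrated":   lambda e: [f"move VLAN {e['vlan']} to segment {e['segment']}"],
    "app_scaled_up": lambda e: [f"raise QoS class for {e['app']} to gold"],
}

def on_event(event_type: str, event: dict) -> list[str]:
    """Look up the policy for an infrastructure event and return the
    network actions to apply; unknown events fall back to manual review."""
    policy = POLICIES.get(event_type)
    return policy(event) if policy else [f"flag '{event_type}' for manual review"]
```

The point of the pattern is the fallback: only scenarios outside the policy table ever cost an operator's time.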
Just as NetApp was looking to become more competitive in virtual environments, so too was EMC in May of last year when it announced a collaboration with LineSider aimed at automating EMC's Atmos cloud computing infrastructure. At the time, EMC was full of predictable praise for its new business partner. It would be interesting to know why the relationship soured and EMC didn't eventually buy LineSider, which some seven months later was apparently more highly prized by Cisco. Using a baseball metaphor, it's like trading one of your best prospects to another team in your division.
According to LineSider, Cisco will sell LineSider as a Cisco product, and existing customer products will continue to be serviced as they are today. Cisco will also continue to fulfill the terms of LineSider's current customer contracts.
Posted by Bruce Hoard on 01/13/2011 at 12:48 PM
The new year brings with it change as long-time Everyday Virtualization blogger Rick Vanover departs, and newcomer Elias Khnaser arrives with his Virtual Insider blog. We hate to see Rick go, because he has built a large and loyal audience that relies on him for technical insights on a wide range of virtualization topics, but we wish him the best at Veeam, where he is now a product strategy specialist.
Actually, we won't be losing Rick entirely, as he will continue contributing his thoughts on a monthly basis. Elias Khnaser has a wealth of technical experience that he has applied as a blogger for both Forbes and InformationWeek. In addition to blogging for VR twice weekly, he will also write a monthly how-to blog, and a bi-monthly column for our print magazine. The good news for you as readers is that you will continue to receive nuts-and-bolts insights on virtualization technologies that impact your day-to-day work lives.
Posted by Bruce Hoard on 01/11/2011 at 12:48 PM
Lately, in the course of talking to startups who are hoping to succeed by storing and processing information in the cloud, I have discovered that some companies are trying to overcome security concerns by encrypting everything that goes back and forth between data centers and clouds. It's an approach that makes sense and it should relieve some potential cloud customers who are still unwilling to let their data leave the premises.
Despite their enthusiasm, however, these cloud storage and application processing vendors, who rely on various cloud service providers to maintain the security of their customers' data, are painfully aware that they must choose their service provider partners very carefully.
Writing in the VMware vCloud blog, Steve Jin of VMware R&D discusses cloud brokers, who provide a single point of contact and management for multiple cloud service providers, while maximizing the benefits of leveraging multiple external clouds.
Comparing cloud brokers to brokers of financial instruments, Jin cites the convenience of not having to worry about placing orders and working with multiple stock exchanges, and asks users to ask themselves if service providers fit their requirements. He also advises users to have backup plans to protect them if they are not satisfied with their providers. Along those lines, Jin continues, users should also find out if they can easily and cost-efficiently switch among cloud service providers.
If users are not happy with the answers to these questions, they should consider a cloud broker, which Jin describes as "Software that helps users and companies get the benefits of external cloud services. Depending on your requirements, it could be offered as a product so that you can install it inside your enterprise or as a service for which you pay as you go."
Jin suggests deploying the cloud broker as a service as opposed to a product, because users require the most recent market data to make the best decisions, and they don't want to have to keep updating a product.
What capabilities should cloud brokers have? They must work transparently with multiple cloud service providers to handle system monitoring, provisioning and billing, among other functions. They should also help users avoid service-provider lock-in by shifting workloads among providers, thus maximizing the performance/price ratios of cloud services. Finally, endorsing the idea of scaling virtual machines beyond low-resource providers, he declares, "Every service provider has a limit which you just don't hit normally."
What's difficult is providing a unified way for customers to use different service providers, whether they be Amazon, Rackspace, Terremark, or another company. "The tricky part of the challenge is while searching for the best deals among the various services, you want to keep the key differentiators of the providers so that you can leverage their comparative advantages when needed."
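The balancing act Jin describes--shopping on price while preserving each provider's differentiators--can be sketched as a tiny placement function. The provider names echo those above, but the prices and feature sets are invented for illustration; no real broker is this simple:

```python
# Illustrative broker catalog: hourly price plus each provider's differentiators.
PROVIDERS = {
    "amazon":    {"price_per_hour": 0.085, "features": {"spot_instances"}},
    "rackspace": {"price_per_hour": 0.080, "features": {"managed_support"}},
    "terremark": {"price_per_hour": 0.090, "features": {"vcloud_api"}},
}

def place_workload(required_features: set[str]) -> str:
    """Return the cheapest provider that offers every required feature,
    so differentiators can still be leveraged when a workload needs them."""
    eligible = {name: p for name, p in PROVIDERS.items()
                if required_features <= p["features"]}
    if not eligible:
        raise ValueError("no provider offers the required features")
    return min(eligible, key=lambda name: eligible[name]["price_per_hour"])
```

A workload with no special needs lands on the cheapest provider; one that requires a differentiator is routed to the provider that has it, even at a premium.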
While noting the market for cloud brokers is still in a nascent, undeveloped state, Jin does recommend one product, Appirio CloudWorks. At least for now, it's clear that buyers in the emerging cloud broker market have a big advantage over sellers.
Posted by Bruce Hoard on 01/10/2011 at 12:48 PM
With its acquisition of AVIcode, a privately owned Baltimore firm renowned for its .NET application performance monitoring capabilities, Microsoft joins a rapidly growing number of companies that are emphasizing the importance of looking behind complex systems to better understand and more quickly respond to problems relating to them. Other companies offering highly sophisticated monitoring capabilities include Netuitive and CiRBA.
Specifically, writes Microsoft Corporate VP Brad Anderson in The System Center Team Blog, Redmond is extending its monitoring capabilities in light of the fact that it is delivering more and more solutions in the forms of SaaS (online) and PaaS (for Windows Azure). Anderson, like so many other vendors in and around the VDI and cloud markets, is concerned about how users experience application quality and performance. In order to gain that understanding, Microsoft--which has been using the AVIcode solution along with Operations Manager in its far-flung datacenters for multiple years on services such as Xbox Live--hopes AVIcode will enable users to closely follow the performance of critical business transactions and better understand how the hardware and software components of their distributed applications or services interact.
According to Anderson, "As more and more applications move to run from the cloud, organizations will want to have access to the capabilities that AVIcode delivers--enabling organizations to get a much deeper understanding of the actual end-user experience, with the details to understand when performance and availability is not at the desired service level, and quickly diagnose where the root issues are that lead to latency within the service, resulting in a poor performance experience for end users."
Anderson goes on to say AVIcode strengthens Microsoft's intentions to operate in private, data center clouds, or public clouds based on Azure. Microsoft customers will benefit from AVIcode technologies because they will be able to more easily ensure that their data centers realize maximum uptime. These technologies, says Anderson, will be integrated into System Center "over time."
Posted by Bruce Hoard on 01/05/2011 at 12:48 PM
Just in case you feel guilty about tossing that piece of litter out your car window, or driving a three-ton SUV, you can perk up by purchasing the new HP t5550, t5565 or t5570 thin clients. According to HP, the t5550 family is so environmentally conscious that it has been recognized as the industry's first EPEAT Gold registered thin client (meaning they have met 23 required environmental performance criteria). Moreover, in addition to being Energy Star qualified, these new devices are free of Brominated Flame Retardants/Polyvinyl Chloride, and their case parts contain more than 30 percent post-consumer recycled plastic. Al Gore probably has a ton of these babies in his new, more energy-efficient mansion.
Brominated Flame Retardants aside, HP is touting the t5550's new HP Easy Tools, which Tad Bodeman, America's Thin Client Category Manager, says avoid vendor lock-in by including pre-installed VMware, Citrix and Microsoft plug-ins, and enable customers to fire up and configure their t5550 devices in less than five minutes with the help of a guided wizard and four simple steps. The end result is a cost of $249 per thin client.
Bodeman also says these new thin clients benefit from the availability of an integrated wireless configurability option that was previously restricted to higher-end products. "This commercially battle-tested wireless capability is ready to go," he notes, adding that HP also offers the t5550s with dual digital monitor support.
Bodeman further points out that VDI is a productive platform for the t5550 series, and that the Personal Systems Group--which he says is growing at a rate of 30 percent annually--views VDI as "a key opportunity" for future growth and profitability.
The new HP Thin Clients also include the SuperScalar VIA Nano u3500 CPU and VX900 Integrated Graphics Processor, which offers hardware-assisted multimedia decoding that HP says rivals traditional PCs. When combined with the standard dual digital monitor support, six USB ports and the HP Universal Print Driver, the t5550s provide the "rich, PC-like user experience" that is all the rage in VDI environments today.
Posted by Bruce Hoard on 01/03/2011 at 12:48 PM
As we march inexorably toward the cloud, the pressure is on to provide a seamless computing environment between datacenters and cloud-based storage and applications. Toward that goal, F5 is offering major enhancements to its Data Solutions portfolio with the introduction of the iControl File Services Solution open storage management API, the ARX Cloud Extender, and the ARX Virtual Edition (VE) appliance.
F5 says the iControl file services API enables software vendors and customers to integrate ARX's file virtualization capabilities to enhance their existing data management solutions and enable entirely new applications. According to F5, "Via the API, customers can use ARX's real-time change notification capability to improve the scalability, responsiveness and efficiency of a range of third-party applications, such as search, index, backup, audit and quota management tools."
By extending ARX's intelligent, automated storage tiering capabilities to support cloud storage services, F5 says it has made it easier to seamlessly integrate cloud storage into existing IT infrastructures. Customers now have simplified data access, because files stored in the cloud are presented as if they reside locally in the data center. "In order to maximize the amount of data tiered to the cloud while minimizing IT overhead and access costs, customers can automatically identify and migrate appropriate files to cloud storage. The new solution is fully qualified for use with Amazon S3, Iron Mountain VFS cloud storage, and NetApp StorageGRID," the company said.
With the introduction of ARX VE, the solution will be available as a virtual appliance in addition to physical, hardware-based offerings. This extends the benefits of ARX beyond centralized data centers, and provides a variety of flexible, deployment options. "Using ARX VE, branch offices and smaller organizations can enjoy performance benefits of the ARX product line that have previously been associated with larger, physical deployments," F5 said. "ARX VE will be available via three distinct deployment options: a trial version for demonstrating purposes, a full version suitable for production environments, and through OEM partners."
Posted by Bruce Hoard on 12/16/2010 at 12:48 PM
Companies such as CloudSwitch, Nasuni and Riverbed have an increasingly compelling story to tell as storage and enterprise apps move progressively toward the cloud. No matter what its content, the ability to securely transmit and receive encrypted data without the fear of having it hacked sounds better every day to senior IT management.
The newly released v2 of CloudSwitch Enterprise is a downloadable software appliance that installs easily into VMware and Xen environments in 20 minutes (What, no Hyper-V?). Its beauty is in its ability to securely simulate data center environments—including management solutions—in the cloud, making interaction transparent to users and developers. As the company puts it, "CloudSwitch eliminates the engineering efforts and changes to applications, networking and management tools that were previously required to use the cloud, protecting customers from lock-in."
Via a simple point and click, v2 provisions new Windows and Linux applications in the cloud without any modifications. This is accomplished through network boot support and ISO support (CD-ROM/DVD). v2 also provides Web services and command-line interfaces for programmatic scaling to meet peak demands. In addition, it eliminates network bottlenecks by "extending internal network topology into the cloud and allowing secure, public IP access." Also new is broader geographic coverage, including Terremark vCloud Express & eCloud, and the Amazon EC2 East, West, EU and Asia Pacific regions.
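To make the programmatic-scaling idea concrete, here is a minimal sketch of what a script driving such a Web-services interface might look like. The endpoint URL, request shape and helper names are all invented for illustration; CloudSwitch has not published this API:

```python
import json
from urllib import request

SCALE_ENDPOINT = "https://cloudswitch.example/api/v2/scale"  # invented URL

def scale_request(tier: str, instances: int) -> bytes:
    """Build the JSON body asking for a given instance count on a tier."""
    if instances < 1:
        raise ValueError("need at least one instance")
    return json.dumps({"tier": tier, "instances": instances}).encode()

def scale_tier(tier: str, instances: int) -> None:
    """POST the desired instance count (network call; illustrative only)."""
    req = request.Request(SCALE_ENDPOINT, data=scale_request(tier, instances),
                          headers={"Content-Type": "application/json"})
    request.urlopen(req)
```

The appeal of exposing scaling this way is that a monitoring script can call it on a schedule or threshold, meeting peak demand without anyone clicking a console.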
CloudSwitch, which is releasing v2 just six months after v1 debuted, is counting on making a hit with IT via its ability to get face time with strategically oriented CIOs and senior managers on the application development side of the house. According to CloudSwitch co-founder and VP of Products Ellen Rubin, the company’s strong security story also resonates with top security officers, who are understandably leery of cloud security. "We really try to enable our customers to lock things down," Rubin says, adding that CloudSwitch, which is funded by Boston-based venture capital firms, currently has a dozen customers, including two large pharmaceutical firms and a large telco service provider.
CloudSwitch Enterprise 2.0 is currently available with a free 15-day trial. Pricing begins at $25,000 for an annual license, including basic support and up to 20 concurrent virtual machines under management in the cloud. Server packs are available for scaling. Cloud usage fees are paid separately to the cloud provider, and additional pricing for scaling cloud deployments is available.
Posted by Bruce Hoard on 12/13/2010 at 12:48 PM
Although cloud computing market leaders like Microsoft, VMware and Citrix are touting the advantages of Infrastructure as a Service for the growing ranks of their cloud customers, a Yankee Group Focus Report entitled "2010 FastView Survey: Cloud Computing Grows Up" makes it clear that IaaS--like Platform as a Service--is still very much an emerging technology that has yet to meet the acceptance found by Software as a Service in IT organizations.
IaaS is based on the concept of combining all servers and storage in a giant pool of resources and allocating them on-demand to applications via private clouds in the datacenter, or shared, public clouds offered by service providers. Despite the benefits of IaaS--including converting capex to opex, paying only for resources consumed, and flexibly scaling resources up or down based on current business requirements--early adopters are cautiously implementing it for "relatively tactical server/storage capacity issues."
However, as Yankee Group learned from its survey of large enterprises with 500 or more employees that have deployed or are considering some kind of cloud solution, things are looking up for IaaS. Specifically, 20 percent reported that they have already implemented IaaS, and among that group, companies with 10,000 or more employees have the highest percentage of current adoption at 24 percent. The report goes on to note:
"An additional 37% are expecting to adopt IaaS some time in the next 24 months, and 60% are evaluating solutions for the near term (i.e. looking to adopt IaaS in less than 12 months). Despite this activity, there remains a segment of the enterprise population (16%) that still has no plans to adopt IaaS, even though they are interested in or already use other types of cloud services."
Many people would be quick to cite security as the primary obstacle to IaaS and cloud adoption, but the Yankee Group notes that five other obstacles are more frequently mentioned. They include "Migrating existing data and applications to the cloud could be costly and difficult," "Regulatory compliance/corporate governance," "Employee resistance," "Lack of measurable business benefits" and "Reliability/availability of cloud platforms."
A plurality of survey respondents (29 percent) consider systems integrators to be the most trusted partners for cloud computing, but when early IaaS adopters were asked who they viewed as their most trusted partners, telecom companies came out on top, garnering 33 percent of responses. Near-term IaaS adopters (less than 24 months) viewed both datacenter providers and SIs as their "best-positioned providers."
Among the conclusions and recommendations cited by Yankee Group, suppliers that help enterprises navigate the evolution of integrating IaaS with their legacy IT infrastructures were mentioned as most likely to succeed against their competition. Overall, the report states, "To foster further adoption, vendors and service providers must adapt their solutions to address enterprise concerns, especially those held by early adopters." It goes on to recommend that service providers work to ease data migration issues, foster interoperability and offer an evolutionary approach to the cloud.
Posted by Bruce Hoard on 12/09/2010 at 12:48 PM
Microsoft recently got in touch with me to see if they could write some kind of year-end story for Virtualization Review about their Server and Tools Business (STB). I told them I would be interested in a brief chronological history of 2010 highlights, and they agreed.
As you can see below, 2010 was a big year for the cloud in Redmond, starting with the GA of Windows Azure and SQL Azure in February, and ending in November with a Hyper-V Cloud program for private cloud deployments. In between, Steve Ballmer committed to the cloud and STB introduced Office 365, among other initiatives.
Microsoft also said it had 10,000 Azure customers in September, and it is looking to add more rapidly in 2011 as it locks horns with VMware in a battle for market leadership.
Top news for Cloud/Virtualization in 2010:
FEBRUARY -- Microsoft announces Windows Azure GA (Official MS Blog, Windows Azure Team Blog)
MARCH -- Steve Ballmer's speech at University of Washington about the company's commitment to cloud computing (video): Ballmer plants the stake in the ground for Microsoft's cloud efforts moving forward
APRIL -- Bob Muglia highlights the importance of management as organizations think about cloud computing at MMS 2010 (keynote video)
JUNE -- Bob Muglia explains the process of delivering cloud to business customers at TechEd North America 2010 (press release, keynote video):
- Windows Server 2008 R2 SP1 announced with new, improved virtualization tools for WS08 R2 and Win7
- Windows Azure and SQL Azure feature updates provide better capabilities for cloud customers
JULY -- Guidance for Microsoft's partners on how to take advantage of cloud computing at WPC 2010 (press release, keynote video): Windows Azure platform appliance helps bring the power of Windows Azure to customer and service providers' own datacenters
OCTOBER -- Office 365, a cloud-based business productivity suite, is launched (press release, Q&A feature, video): Office desktop software, Office Web Apps, SharePoint, Exchange and Lync now available in the cloud
Developers learn about how to create for the cloud at PDC 2010 (press release, video, blog post):
- Windows Azure roadmap rolled out, including a number of platform updates
- Windows Azure Virtual Machine Role and Server Application Virtualization enable customers to get on the path to platform as a service
NOVEMBER -- Advancements to private clouds are announced at TechEd EMEA (press release, keynote video): Hyper-V Cloud program debuts, including programs for private cloud deployments and incentives for partners
Bob Muglia summarizes a busy fall for Microsoft's Server and Tools Business, spanning major events such as PDC, TechEd EMEA and PASS (Q&A feature)
Microsoft announces NCBI BLAST on Azure at Supercomputing 2010 (press release)
Battle-testing the scalability of Windows Azure, NCBI BLAST on Azure puts the power of cloud computing in the hands of scientists and researchers.
Posted by Bruce Hoard on 12/07/2010 at 12:48 PM
The more companies adopt virtualization, the more they need to understand what is going on behind the scenes in their virtualized environments. That need is driving the growth of companies like Akorri, which is currently touting its success with Electro-Motive Diesel (EMD), the world's largest provider of diesel-electric locomotives.
According to Bill Bradford, EMD Global Director of Infrastructure and Applications, the company did an infrastructure refresh of its storage, servers and networks in 2007 as part of its commitment to a server virtualization strategy that includes 90 percent of all the company's servers. Bradford says EMD was able to get between 10 and 30 VMs per physical host, which was good, but all that VM density made it difficult to monitor and maintain available storage and server resources.
"People were having a hard time getting reports to answer their questions," Bradford says.
All that has changed now that EMD has implemented Akorri's BalancePoint software, which Akorri claims is "the only analytics-based IT management software solution on the market designed to optimize performance and utilization of virtual machines, physical servers and storage resources." The Akorri product is agentless and can be purchased as a virtual appliance. It provides multi-vendor, heterogeneous support for VMware vSphere and Microsoft Hyper-V, and is compatible with major storage vendors such as EMC, NetApp, HP, Dell 3PAR, IBM, Hitachi Data Systems and Dell EqualLogic.
Rather than conduct a formal search to meet its product research needs, Bradford says "We just asked around and Akorri is what came back. We put it in and sure enough it told us how many virtual servers were being used. It makes it very easy for us to display analytic reports." In addition, he adds, the package enables tech people to dig down in CPU, storage and I/O levels, allowing them to tweak those levels as required. EMD now gets weekly storage utilization reports that help the company tighten up its IT performance.
Although EMD has not measured a hard-dollar ROI figure, Bradford says that with BalancePoint, he has saved some $250,000 that would have otherwise been spent on additional storage resources. "We like the tool," he says. "We're not looking for anything else right now."
Posted by Bruce Hoard on 12/02/2010 at 12:48 PM