Categories: General

Why HP will prevail over Dell in the 3Par bidding war

The Twittersphere and storage industry are abuzz with the ongoing bidding for 3Par.  HP and Dell are aggressively pursuing the company and have a vested interest in 3Par technology. I believe that HP is more motivated to acquire 3Par and will prevail.

Both Dell and HP believe that a 3Par acquisition will generate additional business value, and they both must realize that the losing party will be placed in a difficult situation.  I believe that HP has more to gain by acquiring 3Par and more to lose by failing to do so.  Here is my assessment of the gains and losses by each bidder:

Categories: Backup, General

The challenge of data growth

One of the biggest challenges with data protection is managing growth.  Some of the common factors that drive increasing capacity requirements include:

  • Intrinsic growth – Growth inherent in the environment as users create new data.
  • New applications – Companies implement new applications to meet changing business requirements.  These solutions could replace existing technologies or could be net new additions.  Either way, they often generate more data to protect and retain.
  • New data types – In today’s multimedia-centric world, there has been a dramatic increase in the number of audio, video and image files being created and protected. These files are much larger and more difficult to compress than traditional content.
  • Mergers & acquisitions – As M&A activity occurs, the acquiring entity must expand its IT infrastructure to absorb the acquired systems and processes.

Categories: Deduplication

Deduplication Strategy and Dell/Ocarina

This week, Dell acquired Ocarina, a provider of primary storage deduplication. The acquisition gives Dell technology that it can integrate with existing storage platforms such as EqualLogic. However, Dell also sells deduplication technology from EMC/Data Domain, CommVault, and Symantec. Dave West at CommVault suggests that these technologies are complementary, and I agree. Still, the announcement raises a significant strategic question – which is the better deduplication strategy, “one size fits all” or “best of breed”?

Deduplication is an important technology in the datacenter, reducing power, footprint, and cooling requirements. However, it typically brings a performance trade-off during read or write operations because of the additional processing required to re-hydrate or deduplicate data. The benefits of the technology are compelling, and we have seen multiple large companies promote different deduplication strategies. Their approaches fall into two broad categories, “best of breed” (BoB) and “one size fits all” (OSFA), and the choice of approach has a major impact. Let’s look at each strategy individually.
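
As a rough illustration of where that trade-off comes from, below is a minimal sketch of hash-based, fixed-size-chunk deduplication. The extra hashing and index lookups happen on the write path, and the chunk reassembly (re-hydration) happens on the read path. This is a conceptual sketch only, assuming SHA-256 fingerprints and 4 KB chunks; it is not any vendor’s implementation.

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunking keeps the example simple


class DedupStore:
    """Toy content-addressed store: unique chunks keyed by SHA-256."""

    def __init__(self):
        self.chunks = {}  # fingerprint -> chunk bytes, stored once
        self.files = {}   # name -> ordered list of fingerprints

    def write(self, name, data):
        """Write path: hash every chunk and store only unseen ones."""
        recipe = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            fp = hashlib.sha256(chunk).hexdigest()  # extra CPU work per chunk
            if fp not in self.chunks:               # extra index lookup per chunk
                self.chunks[fp] = chunk
            recipe.append(fp)
        self.files[name] = recipe

    def read(self, name):
        """Read path: re-hydrate the file by reassembling its chunks."""
        return b"".join(self.chunks[fp] for fp in self.files[name])


store = DedupStore()
store.write("backup1", b"A" * 8192 + b"B" * 4096)
store.write("backup2", b"A" * 8192)               # duplicates backup1's chunks
print(len(store.chunks), "unique chunks stored")  # 2 unique chunks out of 5 written
assert store.read("backup2") == b"A" * 8192
```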

Categories: Backup

Will dedicated VMware protection solutions go the way of CDP?

I previously posted a survey highlighting the different methods of protecting VMware environments. The responses suggested that host-based backup is the predominant approach. The least popular choice was “Dedicated VMware backup application (Veeam, Vizioncore, etc.)”. These solutions exclusively protect virtual environments, and they remind me of the continuous data protection (CDP) technologies of the past.

Three years ago, CDP was hot. It was a major industry buzzword, and several companies were founded to focus exclusively on technologies that promised CDP functionality. CDP enabled instantaneous backup, recovery, and roll-back of critical data, and some predicted that it would replace traditional data protection. The CDP upstarts made bold claims about the technology and the future, but they had minuscule installed bases, particularly compared with the traditional backup application vendors. Their challenge was convincing end users to replace or augment existing backup infrastructures, a difficult proposition given the substantial investments those users had already made in backup software, hardware, and expertise. Although CDP provided customer value, it was only practical as a complement to traditional backup, and the functionality belonged inside existing backup applications. As a result, most dedicated CDP companies were either acquired or disappeared, and we now see backup ISVs including CDP functionality.
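
To show what made the technology appealing, here is a minimal sketch of the core CDP idea, a time-ordered write journal that can be replayed to reconstruct a volume at any prior point in time. It is a conceptual illustration under simplified assumptions, not how any particular CDP product was built.

```python
import time


class CdpJournal:
    """Toy continuous-data-protection journal for a single volume."""

    def __init__(self):
        self.journal = []  # list of (timestamp, offset, data), append-only

    def write(self, offset, data):
        """Every write is captured continuously with a timestamp."""
        self.journal.append((time.time(), offset, data))

    def restore(self, as_of):
        """Roll back: rebuild the volume as it looked at time `as_of`."""
        volume = {}
        for ts, offset, data in self.journal:
            if ts <= as_of:
                volume[offset] = data
        return volume


vol = CdpJournal()
vol.write(0, b"good data")
time.sleep(0.01)
checkpoint = time.time()
time.sleep(0.01)
vol.write(0, b"corrupted!")            # e.g. an accidental overwrite
print(vol.restore(time.time())[0])     # b'corrupted!' -- current state
print(vol.restore(checkpoint)[0])      # b'good data'  -- rolled back
```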

Categories: Deduplication

Storage pools and why they matter

Today SEPATON announced the addition of Storage Pools to our data protection platform. The technology marks a major step on the path to data protection lifecycle management, and I am excited about the new functionality and wanted to share some brief thoughts.

To summarize, storage pooling allows data to be segmented into discrete pools that do not share deduplication. Data sent to one pool will only be deduplicated against information in that pool and will not co-mingle with other data. Additionally, pools provide configuration flexibility by supporting different types of disks with different performance profiles. Pools also benefit from SEPATON’s DeltaScale architecture, which allows for dynamic capacity and performance scalability. Pools are a no-cost option with our latest software release, and customers can implement them in whatever way best meets their business requirements. Some of the benefits include:
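
For example, keeping the deduplication domain scoped to a pool guarantees that one application’s or business unit’s data never shares blocks with another’s. The snippet below is a minimal sketch of that isolation, assuming a simple hash-based index; the class and method names are illustrative and are not SEPATON’s implementation.

```python
import hashlib


class StoragePool:
    """Toy pool: the deduplication index is private to the pool."""

    def __init__(self, name):
        self.name = name
        self.index = {}  # fingerprint -> chunk, never shared across pools

    def ingest(self, chunk):
        """Store a chunk, deduplicating only against this pool's index."""
        fp = hashlib.sha256(chunk).hexdigest()
        is_new = fp not in self.index
        self.index[fp] = chunk
        return is_new  # True if this pool had not seen the chunk before


finance = StoragePool("finance")
engineering = StoragePool("engineering")

finance.ingest(b"quarterly-results")
# The same data sent to a different pool is stored again rather than
# deduplicated against the finance pool -- the domains stay isolated.
print(engineering.ingest(b"quarterly-results"))  # True: stored separately
print(finance.ingest(b"quarterly-results"))      # False: duplicate within its own pool
```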

Categories: Physical Tape

The future of physical tape

Chris Mellor over at The Register posted an article discussing Santa Clara Consulting Group’s (SCCG) recent forecast of the physical tape market.  In summary, SCCG’s latest analysis indicates that physical tape sales (both media and drives) decreased 25% in 2009 and 7% in 2008.  Some may suggest that this accelerating decline is a sign that tape is dead.  I respectfully disagree. Tape still plays an important role in data retention and archival and will be used for years to come.

There are some bright points in SCCG’s forecast. They suggest that LTO drive revenue will grow at a 2.47% compound annual growth rate (CAGR) through 2014 while tape revenue will decline at a 2.21% CAGR. Clearly, they believe that LTO will continue to dominate the market and outperform all other formats.
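
For a sense of what those rates imply, compounding works out to roughly a 13% cumulative gain for LTO drives and an 11% cumulative decline for tape over the forecast period, assuming 2009 as the base year and five compounding periods through 2014 (my assumption, since the report’s exact baseline is not quoted here). A quick sketch of the arithmetic, using a placeholder starting figure rather than SCCG’s actual revenue numbers:

```python
def project(revenue, cagr, years):
    """Compound a starting revenue figure at a given CAGR."""
    return revenue * (1 + cagr) ** years


base = 100.0  # placeholder revenue index, not SCCG's figure
years = 5     # assumed: 2009 base year, compounding through 2014

print(round(project(base, 0.0247, years), 1))   # LTO drive revenue: ~113.0
print(round(project(base, -0.0221, years), 1))  # tape revenue:      ~89.4
```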

Categories: Backup, Restore

Agent-based VMware Backups

My last blog post contained a poll asking visitors about their primary VMware backup methodology. The survey listed the common approaches to protecting virtualized environments, including traditional agent-based backup, VCB/VADP, a dedicated VMware backup application, snapshots, and doing nothing. The results suggest that the agent-based approach is the most commonly used. I anticipate that end users will migrate to backup methodologies that support VMware’s VADP functionality, but I believe there will always be a subset of people who rely on the agent-based approach. When implementing the agent-based approach, you should consider the following:
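
One consideration, for example, is resource contention: in-guest agents that all start at the same time compete for the CPU and I/O of the same physical host. The sketch below staggers agent start times for VMs that share a host; the inventory, window, and interval are illustrative assumptions, not results from the poll.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Illustrative inventory: VM name -> physical host it runs on
VMS = {"vm-app1": "esx01", "vm-db1": "esx01", "vm-web1": "esx01", "vm-app2": "esx02"}


def stagger_schedule(vms, window_start, gap_minutes=30):
    """Offset agent-based backup start times for VMs sharing a host,
    so the in-guest agents do not all hit the same hypervisor at once."""
    by_host = defaultdict(list)
    for vm, host in sorted(vms.items()):
        by_host[host].append(vm)

    schedule = {}
    for host, guests in by_host.items():
        for slot, vm in enumerate(guests):
            schedule[vm] = window_start + timedelta(minutes=slot * gap_minutes)
    return schedule


window = datetime(2010, 6, 1, 22, 0)  # assumed 10 PM backup window
for vm, start in sorted(stagger_schedule(VMS, window).items()):
    print(vm, start.strftime("%H:%M"))
```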

Categories: Backup

Poll: VMware backup methodology

Server virtualization is a very powerful technology that can improve the economics of the datacenter. However, it also creates new challenges for data protection. VMware’s vStorage APIs for Data Protection (VADP) improve the situation, but there are still multiple backup and recovery options. Which do you use?

My primary backup methodology for virtualized servers is:

Categories: Backup, D2D, Restore

Boost vendor lock-in

A couple of weeks ago, I blogged about the benefits of Symantec’s Open Storage Technology (OST). The technology enables accelerated disk-to-disk (D2D) backups, primarily over IP connections, along with additional value-added features. Last week, EMC responded with its announcement of BOOST for NetWorker. Insiders have told me that the BOOST architecture is essentially the same as OST, although the go-to-market strategy is very different. Of course, a major difference is that OST has been shipping for over three years, while BOOST will not be available until sometime in the second half of 2010.

As discussed previously, EMC/Data Domain was unable to create a true global deduplication solution and so was forced to rely on OST to do the heavy lifting. Ironically, they could only support Symantec NetBackup and BackupExec with the new feature because NetWorker did not offer an advanced D2D interface. The BOOST announcement addresses those issues but raises new questions. Specifically, BOOST is positioned as an EMC-only solution, and it is unclear whether the API will be shared with other vendors. In my opinion, this creates a challenge for EMC/Data Domain and NetWorker. Let’s look at how the situation impacts a variety of interested parties.
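
To make the idea of an “advanced D2D interface” more concrete, APIs in this category let the backup application drive appliance operations, for example writing backup images or requesting an appliance-to-appliance copy, through a vendor-supplied plugin rather than a plain file share. The sketch below shows that plugin pattern in general terms; the names are invented and do not reflect the actual OST or BOOST APIs.

```python
from abc import ABC, abstractmethod


class D2DTarget(ABC):
    """Invented interface standing in for an OST/BOOST-style plugin API."""

    @abstractmethod
    def write_image(self, name, data):
        """Store a backup image on the appliance."""

    @abstractmethod
    def replicate(self, name, remote):
        """Ask the appliance to copy an image to a peer appliance itself,
        rather than having the media server read it back and re-send it."""


class ExampleAppliance(D2DTarget):
    def __init__(self):
        self.images = {}

    def write_image(self, name, data):
        self.images[name] = data

    def replicate(self, name, remote):
        remote.images[name] = self.images[name]  # appliance-side copy


# The backup application codes to the interface, not to vendor internals.
primary, dr_site = ExampleAppliance(), ExampleAppliance()
primary.write_image("fulls/clientA", b"...backup image...")
primary.replicate("fulls/clientA", dr_site)
print("fulls/clientA" in dr_site.images)  # True
```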

Categories: Backup, Restore

Data protection storage and business value

George Crump posted an article over on Network Computing discussing why storage is different for data protection. He makes a number of points regarding the benefits of using a storage appliance approach versus a software-only model, and for the most part, I agree with his analysis. However, there is an important point missing.

The software-only model relies on a generic software stack that can use any hardware or storage platform. This extreme flexibility also creates extreme headaches. The software provider, or ISV, cannot certify every hardware and environment combination, so the customer is responsible for installing, qualifying, and testing the system. Initial setup can be difficult, but support can be even harder.

What happens if the product is not performing? Support quickly becomes complicated. Do you call your software ISV, your storage vendor, your SAN provider, or your HBA vendor? There is a myriad of hardware pieces at play, and the challenge becomes diagnosing and resolving any product issues. This is less of a problem in small environments with simple needs, but it rapidly becomes an issue as data sizes grow.