NetApp’s initial bid for Data Domain came as a surprise to many; EMC’s counter was an even bigger shock. This bidding war has important implications for data protection and deduplication. Two thoughts immediately come to mind:
It’s hard to do deduplication well.
EMC and NetApp say that they have robust deduplication solutions in their DL3D (Quantum technology) and NearStore VTL product lines. Before these negotiations, you might have believed them. Now they are both bidding aggressively for Data Domain. What does that say about their confidence in their own solutions? Remember, these are large companies with hundreds (thousands?) of engineers with storage experience. Why wouldn’t they just build their own deduplication technology? The simple answer is that developing really good, enterprise-class deduplication technology is difficult.
The sheer size and complexity of large enterprise backup environments make deduplication significantly more challenging than it is for small SME backups. Smaller environments have lower expectations for performance and capacity; simpler backup, retention, and restore requirements; and little need for global dedupe or huge single-system capacity scalability. Data Domain focuses on these smaller environments, and whoever acquires the company will not gain a solution for large enterprises.
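To see why the basic idea is simple but scaling it is not, consider a minimal fixed-block deduplication sketch. This is purely illustrative and not based on any vendor’s implementation: real products must handle variable-length chunking, multi-terabyte hash indexes that no longer fit in memory, and global dedupe across systems, which is exactly where the engineering gets hard.

```python
import hashlib

def dedupe(data: bytes, block_size: int = 4096):
    """Store each unique fixed-size block once, keyed by its SHA-256 hash."""
    store = {}   # hash -> block (the "chunk store")
    recipe = []  # ordered hashes needed to reconstruct the stream
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        h = hashlib.sha256(block).hexdigest()
        store.setdefault(h, block)  # write the block only if unseen
        recipe.append(h)
    return store, recipe

def restore(store, recipe):
    """Rebuild the original stream from its recipe of hashes."""
    return b"".join(store[h] for h in recipe)

# A highly repetitive "backup": 100 identical blocks dedupe to 1 stored block.
backup = b"A" * 4096 * 100
store, recipe = dedupe(backup)
assert restore(store, recipe) == backup
assert len(store) == 1 and len(recipe) == 100
```

At enterprise scale, that innocent-looking `store` dictionary becomes a disk-resident index of billions of entries, and every backup stream hammers it with lookups; keeping ingest fast under that load is the hard part.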
Deduplication is a must-have technology for the enterprise.
Many large vendors have tried to dismiss deduplication as a mere feature. Ironically, that feature is what built Data Domain. Clearly deduplication is invaluable; in fact, it is so important that EMC and NetApp are locked in a battle over it.
I believe these two points are the key implications of this bidding war. In a future post, I will share additional thoughts about the likely winner.