Categories: Backup, Physical Tape, Replication

Perspectives on Symantec OpenStorage

A couple of weeks ago, SEPATON demonstrated OpenStorage (OST) at Symantec Vision, and I posted a blog entry with a link to the demo. Here, I want to explore OST in more detail.

OST is Symantec’s intelligent disk interface. It works with all types of disk targets and is most commonly implemented with deduplication-enabled storage. OST addresses disk as disk, rather than through the traditional tape-based metaphor. It handles backups as images and allows the backup application to read and write data simultaneously and to incrementally delete expired information. OST also enables access to NetBackup’s native disk features such as SAN Client Backups, Media Server Load Balancing, Intelligent Disk Capacity Management, and Storage Lifecycle Policies. These NetBackup features can benefit end users but are outside the scope of this blog; in this post, I want to discuss the features that are unique to OST.
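To make the contrast concrete, here is a minimal, hypothetical sketch of the two models in Python. It is not the real OST plugin API (that is a proprietary Symantec SDK); it only illustrates why addressing individual backup images on disk permits the incremental expiration that the tape metaphor cannot.

```python
from datetime import datetime

# Hypothetical illustration only -- NOT the real OST plugin API.

class TapeCartridge:
    """Tape metaphor: space is reclaimed only when the whole cartridge expires."""
    def __init__(self):
        self.images = []  # backup images appended sequentially

    def reclaimable(self, now):
        # The cartridge can be overwritten only after EVERY image on it expires.
        return all(img["expires"] < now for img in self.images)

class DiskImagePool:
    """OST-style model: each backup image is addressed individually on disk."""
    def __init__(self):
        self.images = {}  # image name -> metadata

    def write(self, name, expires):
        self.images[name] = {"expires": expires}

    def expire(self, now):
        # Expired images are deleted incrementally, freeing space immediately,
        # while the remaining images stay readable (concurrent read/write).
        for name in [n for n, m in self.images.items() if m["expires"] < now]:
            del self.images[name]

pool = DiskImagePool()
pool.write("client1_full_may", datetime(2009, 5, 31))
pool.write("client1_full_jun", datetime(2009, 6, 30))
pool.expire(datetime(2009, 6, 15))  # only the May image is removed
```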

The challenge that end users grapple with is how to move or transform data on their backup appliance while maintaining NetBackup (NBU) catalog consistency. This can be particularly difficult when using appliance-based tape copy or replication. OST addresses these issues by enabling the appliance to access the NBU catalog. This means that NBU can instruct the appliance to replicate a copy of the data and maintain separate retention policies on the two copies. Let’s look at these features in more detail:
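As one way to picture the relationship, here is a minimal Python sketch of catalog-driven replication with independent retention. The names (BackupImage, backup, replicate) are illustrative stand-ins, not NetBackup commands or APIs.

```python
from dataclasses import dataclass

# Illustrative model of NBU-directed, catalog-consistent replication;
# none of these names are actual NetBackup interfaces.

@dataclass
class BackupImage:
    name: str
    site: str
    retention_days: int

catalog = []  # stands in for the NBU catalog: one record of truth for all copies

def backup(name, retention_days=14):
    img = BackupImage(name, "primary", retention_days)
    catalog.append(img)
    return img

def replicate(img, retention_days=90):
    # NBU instructs the appliance to copy the image; because the copy is
    # recorded in the same catalog, it can carry its own retention policy.
    copy = BackupImage(img.name, "dr_site", retention_days)
    catalog.append(copy)
    return copy

local = backup("client1_full_jun")  # kept two weeks on the local appliance
remote = replicate(local)           # kept 90 days at the DR site
```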

Categories: Backup, Deduplication, Replication

Deduplication ratios and their impact on DR cost savings

There is an interesting blog discussion between Dipash Patel of CommVault and W. Curtis Preston of Backup Central and TruthinIT regarding whether higher deduplication ratios deliver increasing or diminishing benefits. They take different perspectives, and I will highlight their points and add one more to consider.

Patel argues that increasing deduplication ratios beyond 10:1 provides only a marginal benefit. He calculates that going from 10:1 to 20:1 raises capacity efficiency by only five percentage points and suggests that this gain is negligible. He adds that vendors who claim a doubling in deduplication ratio will result in a doubling in cost savings are using a “sleight of hand.” He makes an interesting point, but I disagree with his core statement that increasing deduplication ratios beyond 10:1 provides only marginal savings.
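The arithmetic behind both positions is easy to check. The short calculation below assumes 100 TB of backup data, a figure chosen purely for illustration:

```python
# Back-of-the-envelope arithmetic for 100 TB of protected backup data.
protected = 100.0  # TB

for ratio in (10, 20):
    stored = protected / ratio
    savings = 100 * (1 - stored / protected)
    print(f"{ratio}:1 -> {stored:.0f} TB stored, {savings:.0f}% capacity savings")

# 10:1 -> 10 TB stored, 90% capacity savings
# 20:1 -> 5 TB stored, 95% capacity savings
#
# Patel's framing: savings rise only from 90% to 95%, five percentage points.
# The other way to read the same numbers: the disk you actually buy -- and the
# data you must move for DR -- is cut in half, from 10 TB to 5 TB.
```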

Categories: Replication

Introducing DeltaRemote

With all the recent hype, you may have missed that SEPATON launched DeltaRemote a couple of weeks ago. DeltaRemote is a software upgrade for existing DeltaStor users that enables deduplicated replication between SEPATON VTLs. The new features include:

  • Multi-node support – DeltaRemote leverages SEPATON’s DeltaScale architecture to use multiple nodes for replication. It’s fast and concurrent just like DeltaStor.
  • Fast restore performance at the remote site – I have discussed in the past how DeltaStor has some unique features to enable industry-leading restore performance. The same technology has been extended to the VTL on the remote site.
  • Simple management – Manage replication through SEPATON’s existing GUI. Detailed reporting and 30-day bandwidth efficiency analysis make planning and optimization a snap.
  • Cartridge level control – DeltaRemote provides complete tape cartridge-level control of replication and recovery. You can easily set replication policies or manually choose cartridges to replicate or recover, in the same format as tape libraries (see the sketch after this list).
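To show what cartridge-level control might look like, here is a minimal sketch. The structures and function names are illustrative, not SEPATON’s actual management interface.

```python
# Illustrative only -- not SEPATON's actual management interface.
cartridges = {
    "A00001": {"pool": "daily",   "modified": "2009-06-01"},
    "A00002": {"pool": "monthly", "modified": "2009-06-01"},
    "A00003": {"pool": "daily",   "modified": "2009-06-02"},
}

# Policy-based: replicate every cartridge in a given pool automatically.
def policy_replicate(pool):
    return [bc for bc, meta in cartridges.items() if meta["pool"] == pool]

# Manual: an operator picks specific cartridges, e.g. for an ad hoc recovery.
def manual_replicate(barcodes):
    return [bc for bc in barcodes if bc in cartridges]

print(policy_replicate("daily"))     # ['A00001', 'A00003']
print(manual_replicate(["A00002"]))  # ['A00002']
```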

Stay tuned for more detailed information on DeltaRemote.

Categories: Deduplication, Replication

Recent Comment

Recently, an end user commented that the replication performance on his DL3D 1500 was less than expected. As he retained more data online, his replication speed decreased substantially, and EMC support responded that this is normal behavior. This is a major challenge: slow replication increases replication windows and can make DR goals unachievable.

The key takeaway from the comment is that testing is vital. When considering any deduplication solution, you must test it thoroughly with both limited and extended retention. In this case, the degradation appeared only as retained data accumulated; it would not have surfaced in a test with limited retention. The key elements you should test include the following (a simple test harness is sketched after the list):

  1. Backup performance
    1. On the first backup
    2. With retention
  2. Restore performance
    1. On the first backup
    2. With retention
  3. Replication performance
    1. On the first backup
    2. With retention
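Below is a minimal sketch of how that matrix could be driven. The run_fn hook is an assumption: you would supply whatever actually exercises your own appliance in each scenario.

```python
import itertools
import time

OPERATIONS = ("backup", "restore", "replication")
SCENARIOS = ("first backup", "with retention")  # e.g. weeks of retained data online

def run_cell(operation, scenario, data_gb, run_fn):
    # Time one operation/scenario combination and return throughput in GB/s.
    start = time.time()
    run_fn(operation, scenario)  # caller-supplied hook that drives the appliance
    return data_gb / (time.time() - start)

def full_matrix(data_gb, run_fn):
    # Measure every combination; the "with retention" column is where
    # degradation like the DL3D behavior above would show up.
    return {
        (op, sc): run_cell(op, sc, data_gb, run_fn)
        for op, sc in itertools.product(OPERATIONS, SCENARIOS)
    }
```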