Faster Backups – Local. Offsite. Remote Office.
Why Should You Care?
According to a recent survey from the IDG Research Group, the biggest challenge facing IT managers
responsible for backup is meeting their backup windows.
Quickly growing data sets, a variety of applications and IT assets, and multiple remote locations are
causing backup windows to be missed by 4 hours or more 30% of the time, according to the survey.
Backups that extend past their window and don’t complete successfully are a major risk, especially
for organizations that rely heavily on mission-critical data for their operations.
The backup performance problem is especially challenging with offsite backups. Whether you’re using
tapes, replication software and remote servers, cloud backup or appliance-based hybrid backup, the
speed challenge is still there.
How To Improve Performance and Make Backups Faster
There are three components to faster backups: backup less data daily, move data offsite faster, and store
and recover data faster from remote servers or datacenters.
1. Backup Less Data Daily – In a normal workday, the amount of changed data is less than 2% of a company’s total data, according to the average change rate of Zetta customers. So if the backup software transmits anything more than the new incremental change data, it adds unnecessary time to backups and consumes network bandwidth, especially when the data is being sent over the WAN.
There are three techniques for addressing this challenge that can make backups faster:
a. Local cache of digital signatures.
Backup software that has a local cache of your backup data’s digital signatures can compare
today’s changes to the version in yesterday’s backup before copying any files. This local manifest
is much smaller than the original backup set, and with a lightweight software client will allow the
server or laptop to continue running without drag.
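As an illustration, the manifest comparison described above might look like the following Python sketch, which hashes each file and checks it against a locally cached manifest, so no round trips to the backup server are needed during change detection. The file layout and JSON manifest format here are hypothetical assumptions for illustration, not Zetta’s actual design:

```python
import hashlib
import json
import os

def file_digest(path, chunk_size=1 << 20):
    """Return the SHA-256 digest of a file, read in 1 MB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def changed_files(root, manifest_path):
    """Compare current digests to the cached local manifest; return changed paths."""
    try:
        with open(manifest_path) as f:
            manifest = json.load(f)
    except FileNotFoundError:
        manifest = {}  # first run: every file counts as changed
    changed, current = [], {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            digest = file_digest(path)
            current[path] = digest
            if manifest.get(path) != digest:
                changed.append(path)
    with open(manifest_path, "w") as f:
        json.dump(current, f)  # refresh the local cache for tomorrow's run
    return changed
```

Because the manifest of digests is tiny compared to the data itself, this comparison runs locally and quickly, which is the point of the technique.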
b. Byte-level change detection.
Once the software detects the files that changed since yesterday, it determines the delta between
the last backup and the current state of those files at the byte level. This significantly cuts the
total amount of data transmitted and shortens backup times accordingly.
Most backup software products use change detection in a way that was originally intended for
backing up to tape: all changes are grouped together and backed up together. Better results come
when the software has been built from the beginning to work with enterprise-grade cloud storage,
patching changed byte-level blocks into individual files in the backup copy to bring it current
with your production data.
Zetta’s backup software replicates the file system of a backed-up server in Zetta’s datacenters,
so there is a full replica of your server’s file system, in its native format, in the cloud.
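A minimal sketch of byte-level change detection, using fixed-size blocks for simplicity (production systems typically use rolling checksums, and Zetta’s actual implementation is not public). Only the blocks that differ are transmitted, and the remote side patches them into its copy of the file:

```python
BLOCK = 4096  # fixed-size blocks for this sketch; real products use rolling checksums

def block_delta(old: bytes, new: bytes):
    """Return {block_index: new_bytes} for every block that differs."""
    delta = {}
    n_blocks = (max(len(old), len(new)) + BLOCK - 1) // BLOCK
    for i in range(n_blocks):
        lo, hi = i * BLOCK, (i + 1) * BLOCK
        if old[lo:hi] != new[lo:hi]:
            delta[i] = new[lo:hi]
    return delta

def apply_delta(old: bytes, delta: dict, new_len: int) -> bytes:
    """Patch the changed blocks into the backup copy to bring it current."""
    out = bytearray(old[:new_len].ljust(new_len, b"\0"))
    for i, chunk in delta.items():
        out[i * BLOCK : i * BLOCK + len(chunk)] = chunk
    return bytes(out)
```

In this scheme, only `delta` crosses the wire; for a large file with a few changed regions, that is a small fraction of the file’s total size.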
If backup software or an appliance has to communicate with remote servers over the LAN or the
Internet to determine file change information before transmitting the backup data, it significantly bloats the traffic and slows down the backup process.
Having a local catalog for changed data (since the last backup) eliminates a huge amount of
back-and-forth communication and significantly reduces backup times.
To illustrate how much of a difference the local cache of digital signatures makes, a private college with millions of small files in its backup set recently shortened its backup time from 24
hours with a leading backup software vendor to 90 minutes with Zetta’s byte-level change
detection technology and a local catalog of digital signatures.
Compression is another tactic for reducing the amount of data in the daily backup. Advanced data
compression encodes files in a way that eliminates statistically redundant bits without losing
any of the original data. Zetta’s advanced compression can achieve up to a 3:1 reduction in file
size, depending on the type of file.
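Python’s standard zlib module illustrates the lossless-compression idea: redundant data shrinks dramatically, while decompression always recovers the original bytes exactly. As noted above, the actual ratio depends heavily on the type of file:

```python
import zlib

def compress_ratio(data: bytes, level: int = 9) -> float:
    """Compress losslessly and report the size-reduction ratio (original / compressed)."""
    packed = zlib.compress(data, level)
    assert zlib.decompress(packed) == data  # nothing is lost
    return len(data) / len(packed)
```

Highly repetitive data (logs, databases, office documents) compresses well; already-compressed formats like JPEG or ZIP barely shrink at all, which is why vendors quote ratios like 3:1 as “up to” figures.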
The other aspect of file reduction is daily snapshot versioning. A snapshot is a point-in-time copy
of your backup data: a read-only version that’s much smaller than it would be if the same data was
stored as a separate full copy each day.
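The space savings of snapshot versioning can be sketched with a content-addressed block store, where each daily snapshot is just a read-only list of block references and unchanged blocks are stored only once. This is a simplified illustration of the concept, not Zetta’s actual design:

```python
import hashlib

class SnapshotStore:
    """Content-addressed store: daily snapshots share their unchanged blocks."""

    def __init__(self, block: int = 4096):
        self.block = block
        self.blocks = {}     # digest -> bytes, each unique block stored once
        self.snapshots = {}  # name -> tuple of digests (a read-only view)

    def snapshot(self, name: str, data: bytes) -> None:
        """Record a point-in-time, read-only view of `data`."""
        refs = []
        for i in range(0, len(data), self.block):
            chunk = data[i:i + self.block]
            digest = hashlib.sha256(chunk).hexdigest()
            self.blocks.setdefault(digest, chunk)  # dedupe across snapshots
            refs.append(digest)
        self.snapshots[name] = tuple(refs)

    def restore(self, name: str) -> bytes:
        """Reassemble the full data for a snapshot from its block references."""
        return b"".join(self.blocks[d] for d in self.snapshots[name])
```

Two snapshots that differ in one block cost only one extra block of storage, which is why a snapshot is far smaller than a second full copy.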
2. Move Data Offsite Faster – Sending data offsite is a major limitation of most backup solutions, because the vast majority of backup software and appliances use single-threaded data
transfer and limited proprietary communication protocols that are not WAN-optimized.
To maximize the speed of moving data offsite, your backup software has to be WAN-optimized.
WAN optimization is achieved using the incremental-forever methodology, where you upload only one full backup, ever. All subsequent daily incremental backups are merged with that full backup, so the previous day’s version is always a synthetic full backup and no further uploads of full backups are needed.
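Conceptually, building a synthetic full backup is a merge of the daily incremental change sets into the last full copy. A toy sketch, modeling each backup as a path-to-contents map (using None to mark a deletion is an assumption for illustration):

```python
def synthetic_full(full: dict, incrementals: list) -> dict:
    """Merge daily incrementals into the last full backup, producing a synthetic full.

    Each backup maps path -> file contents; None marks a deleted file.
    The merge happens offsite, so no second full backup ever crosses the WAN.
    """
    merged = dict(full)
    for inc in incrementals:
        for path, contents in inc.items():
            if contents is None:
                merged.pop(path, None)  # file was deleted that day
            else:
                merged[path] = contents  # file was added or changed
    return merged
```

Because the merge runs in the datacenter, the WAN only ever carries the small daily deltas, yet a current full restore point is always available.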
WAN optimization also includes data transport that is multi-threaded, as opposed to single-threaded. Most backup software is architected to copy data serially and is highly latency-sensitive, meaning that a 100 Mbit connection may yield only 20 Mbit of usable backup throughput. This is the same reason moving files via FTP from a remote server doesn’t maximize available bandwidth.
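The benefit of multi-threaded transport is that round-trip latency overlaps across parallel streams instead of stalling a single one. A minimal Python sketch, with a sleep standing in for WAN latency (the chunking, worker count, and stub uploader are illustrative assumptions, not a real protocol):

```python
from concurrent.futures import ThreadPoolExecutor
import time

def upload_chunk(chunk: bytes) -> int:
    """Stand-in for one network send; the sleep models WAN round-trip latency."""
    time.sleep(0.05)
    return len(chunk)

def upload(chunks, workers: int = 8) -> int:
    """Send chunks over `workers` parallel streams so latency overlaps.

    A serial sender would pay the full round-trip delay once per chunk;
    with 8 workers, 8 chunks are in flight at any moment.
    """
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(upload_chunk, chunks))
```

With 16 chunks and 8 workers, total wall time is roughly two round trips instead of sixteen, which is how multi-threaded transport turns a latency-bound link back into a bandwidth-bound one.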
3. Store and Recover Data Faster – The architecture and infrastructure of the cloud datacenter where your data is stored plays a major role in how fast the backup completes and how fast files can be recovered. Today’s technology allows anyone with a server in a closet to call it a datacenter, but serious companies know to look for enterprise-grade infrastructure.
The key datacenter requirements for companies that are serious about protecting data are:
• Multiple geo-diverse datacenters are a must when superstorms can threaten the entire east coast of the
US. Using a backup solution that replicates to multiple datacenters is also more cost-effective than
leasing rack space in multiple co-lo datacenters.
• A redundant backend file infrastructure that can scale up and down
quickly, without file count or size limits, gives you the flexibility your
company will need in the future.
• Knowing which backup and recovery actions in the datacenter are
handled by SSD vs. RAM vs. HDD is a critical detail affecting how fast
your requests will be completed. For example, part of the reason Zetta DataProtect works so fast is that 80% of requests to the datacenter
are handled using SSD or RAM. Cloud datacenters with infrastructure
that relies on hard drives may not keep up when a major recovery is underway.
Here’s a faster backups feature checklist to use when evaluating vendors:
Backup Less Data Daily
☐ Sub-file / byte-level change detection
☐ Local file (manifest) change comparison
☐ Advanced data compression
Move Data Offsite Faster
☐ WAN optimization
☐ Multi-threaded data transport
Store and Recover Faster
☐ Multiple geo-diverse datacenters
☐ Redundant datacenter infrastructure
☐ SSD or RAM primary storage
Zetta DataProtect For Faster Backups
Zetta’s 3-in-1 backup solution, DataProtect, incorporates all the features described above
to deliver faster backups on-premises, offsite, and for remote offices.
Most of Zetta’s customers comment on how critical backup speed is for them: it’s what lets them complete backups on time without exceeding their backup windows.
The best way to see the difference in backup performance is to try the product. You can start a free
trial by following this link. It’s free for 15 days and has no limitation on data size, types, or bandwidth.
Backup Performance Examples from Zetta Customers
• A software company that was already backing up more than 50 TB wanted to add an additional 7.7 TB to
their Zetta volume. Even with their internal firewall throttled at 320 Mbps, they were able to complete the initial backup of the new data in less than 48 hours because of Zetta’s enterprise-grade datacenter infrastructure.
• Another example comes from an advertising agency with 150 Mbps of bandwidth. During a
recent surge in data, their transfer speed averaged between 145 and 150 Mbps, fully utilizing
their large pipe because that data was written to SSD and RAM.
Missing nightly backup windows can have ramifications throughout your company, from
unavailable versions of files to slow network performance during the work day while backups are still trying to
complete. Zetta DataProtect uses more than 10 patent-pending technologies to speed up your backups,
eliminating missed backup windows both today and as the amount of data being backed up
grows in the future. WAN optimization, multi-threaded data transport, and fast SSD storage in the cloud
datacenters are high-performance technologies that until now have been available only to enterprises
with large IT budgets. Zetta has included these technologies in a complete solution that, for the first time, is affordable for
small and medium-size businesses.
Zetta’s 3-in-1 backup offers the best performance available for faster backups. Pricing starts
at $225 a month, including backup software with all the features described in this technical
brief, 500GB of enterprise-grade cloud storage and 24/7/365 engineer-level support.
Please ask us about how your backups can be done faster.
Get custom pricing.
Start a free trial.
call us: 1-877-469-3882
email us: email@example.com