Very few people realize how much data they are storing on their local computers. What would happen if your computer crashed or got a virus and all your data was lost? Would you miss it? Many of us never think to back up our data until it is lost.

Users who do perform regular backups usually store copies locally, in the same physical location, on CD/DVD/USB drives or tape. This type of backup is a good start, but if something were to happen to that computer, disk drive, or location, all their data, including all their backups, would be lost.

To avoid having this happen to you, it is important not only to back up your data locally but also to keep a copy of your data offsite in the “cloud”.

What is cloud based backup?

Cloud backup is different from cloud computing. Cloud backup means your data is stored in a remote data center, where it is protected from unforeseen events such as natural disasters, viruses, or hardware malfunctions.

Why use cloud backups?

1. In the United States alone, businesses lose an average of $12 billion per year due to data loss.

2. 70% of businesses that suffer major data loss go out of business within 24 months.

3. Encrypting your data helps you comply with regulations such as HIPAA.

4. Hard drives have an average life span of 3–5 years.

5. All CD/DVD discs will eventually fail.

The ROBOBAK Cloud Backup Solution provides a very simple approach to cloud-based data protection. ROBOBAK gives you peace of mind knowing that your data will be available anytime and anywhere: simply use your web browser or the Executive Backup Client agent to restore your files, no matter where you are. This ensures that you’re always protected, because as Murphy’s law states: “Anything that can go wrong, will go wrong.”

As we celebrate the end of 2010, we would like to take this opportunity to wish all of our partners a safe and joyous holiday season and a healthy and prosperous New Year!

This past year has been a very successful and exciting year for our team. 2011 will be a year filled with excitement and major strides for technology, innovation and new partnerships. 

Happy holidays from the ROBOBAK Team

On behalf of the entire ROBOBAK team, here’s to a wonderful end of the year for all, and we look forward to continuing successes with you in 2011! Cheers, The ROBOBAK Team

If you didn’t believe all the noise about cloud computing in the enterprise before, perhaps the latest news will help you change your thinking.

A new survey from Morgan Stanley reveals that 70% of CIOs questioned plan to move some, or all, of their infrastructure to the cloud in 2011. That’s almost twice as many as those who answered positively heading into 2010.

Moving to the Cloud

The 451 Group estimates that the cloud-computing market will grow to $16.7 billion in revenue by 2013, a projected CAGR of 24% over the next three years.

The Revere Group predicts “consumers will also drive growth of this technology with their continued rapid adoption of smart-phones and tablets, which increasingly rely on cloud computing for their email and other application services.”

And that doesn’t even count Google’s forthcoming Chrome OS, which runs entirely in the cloud.

I’m proud to say that our data protection solution, ROBOBAK, has been at the forefront of this movement for many years. From our perspective, it simply makes sense, and has for quite a long time. If you’re a company with multiple locations, the ability to safeguard your data offsite is a virtual no-brainer. Depending on the sensitivity of your data, you could store it in a public cloud, a private cloud, or use a hybrid approach. Your integrator/reseller/solution provider can help you figure out which of the three options is best for you.

This trend is only going to grow. Recent data from IDC shows we’re saving everything…because we can, and (in many cases) because the lawyers increasingly say we must. So we need a place to store the data that can easily grow with our rapidly changing requirements. That place, obviously, is in the cloud.

A couple of quick caveats: as you’re setting up your cloud storage infrastructure, make sure you work with your provider to establish a concise, easy-to-follow disaster recovery plan. The last thing you want is to suffer a data disaster, only to realize you’re not quite sure where your data is or how to get it back. Also, make sure that you know what data needs to be protected and that those files are actually being backed up (see the Dec 9, 2010 blog post, “Protecting what is near and dear to my wife, her pictures,” to find out how ROBOBAK template backups can help).

All of this is happening in the cloud. It’s a place where ROBOBAK has been for several years, and we’re pleased to see that the market is finally catching up.

Thanks to the relatively recent development of deduplication technologies, US DataVault is not only more efficient for our clients, but more profitable as well.

For the newcomer, deduplication eliminates redundant backup and archive data at the client before sending it across the network to servers and storage. Users can easily extend deduplication globally, across thousands of clients with different retention requirements. This all adds up to shorter backup windows, lower management costs, and more robust, reliable data protection. Typical benefits of deduplication include (a minimal sketch of the source-side approach follows the list below):

– Cut the backup window by finishing jobs faster.
– Transfer significantly less data across the network.
– Maximize data reduction by deduplicating data globally across hundreds of clients in different locations with different protection policies.
– Easily and cost-effectively scale out with open systems hardware.
– Cover both small and large data sets in remote offices and large data centers.
– Cost efficiency: by reducing the data footprint, storage costs are cut, sometimes dramatically, with no loss of recoverability on any piece of data.
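
To make the source-side idea concrete, here is a minimal Python sketch under simplified assumptions: fixed-size blocks, a hypothetical known_hashes set standing in for the server’s index, and a hypothetical send_block callback. It illustrates the general technique, not ROBOBAK’s actual implementation.

```python
import hashlib

BLOCK_SIZE = 256 * 1024  # fixed-size blocks for the example; real products often use variable-size chunking

def backup_file(path, known_hashes, send_block):
    """Source-side dedup sketch: only blocks the server has never seen are sent in full."""
    with open(path, "rb") as f:
        while True:
            block = f.read(BLOCK_SIZE)
            if not block:
                break
            digest = hashlib.sha256(block).hexdigest()
            if digest in known_hashes:
                # Redundant block: send only a reference to the existing copy.
                send_block(digest, None)
            else:
                known_hashes.add(digest)
                send_block(digest, block)
```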

Some data managers view deduplication as a quick fix for complex protection or recovery issues arising from rapid data growth, shrinking operating windows, and very aggressive recovery SLAs. Those who use deduplication as a band-aid, especially in the form of an appliance, quickly realize they have only temporarily addressed data reduction on disk: as the data grows, they face additional hardware and operational costs as more and more appliances must be pressed into service. But more important, administrators most often find they have made NO impact on reducing backup times or overall costs.

Almost always there is a significant gap between the short-term fix to explosive data growth and achieving REAL, meaningful improvement: higher performance, smaller operating windows, faster recovery, lower costs, scalability, security, minimized management, and relative ease of use.

Real improvement requires an approach that is not an afterthought, but rather an integrated part of any data management solution. Thanks to ROBOBAK’s deduplication initiative, USDV was among the early providers of backup services to offer fully integrated deduplication in a single software solution, with NO required hardware.

ROBOBAK’s leadership in this area expands deduplication benefits to address real bottlenecks by bringing efficiencies as close to the source as possible. The current version of US DataVault Backup Pro uses deduplication to reduce data at the source, using fewer network resources, dramatically shortening backup windows, and eliminating the lock-in associated with appliance-based approaches.

If there is one thing that my wife is, it’s a picture fanatic. She literally takes hundreds of pictures a month! If you don’t believe me, ask my family, friends, and Facebook contacts.

My job of course is to keep these pictures safe! That job is considerably harder because she stores her pictures everywhere. I can never seem to figure out why it’s such a hard concept to simply store all pictures in one place, the most obvious place being the My Pictures folder.

Keeping pictures stored everywhere on a computer makes my backup job a little harder, but there are a few different ways I can go about getting this done.

I could back up the entire PC, including files I don’t need to protect. This strategy is pretty simple, but the drawbacks are significant. Most users would avoid backing up the whole computer just to capture the images, for the following reasons:

– Bandwidth is wasted backing up things you don’t need.
– Extra offsite storage space is needed, and the costs associated with storing so much data are high.
– Backup time is longer.
– Restoration becomes difficult since the pictures are still hidden like a needle in a haystack.
– Restoration time is longer.

I could also spend hours manually selecting all of the picture folders. But this method is prone to problems because:

– It is a waste of my time.
– I will probably end up forgetting some locations.

– It won’t back up newly added files my wife adds at a later date.

There is a better way than the full PC backup though. The answer is a technology included in ROBOBAK called Smart Crawl Template Backup. Smart Crawl Template Backup uses file type templates to selectively find what to back up based on the type of file. Think of it like an automatic picture search engine for backup. Template backups don’t end at pictures though; common patterns include office documents, music, and source code, to name a few. Each template can be applied to back up:

– All files matching the template.
– All files except the files matching the template.

The best part about these Smart Crawl Template Backups is that you can specify a template at the entire network level, effectively backing up all computers, no matter which computer my wife happens to be on when she’s downloading her photos! (A minimal sketch of how such a template crawl might work follows below.)
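
For illustration only, here is a hypothetical Python sketch of a template-driven crawl. The pattern list, the include/exclude flag, and the function names are assumptions made for the example, not ROBOBAK’s actual template format.

```python
import fnmatch
import os

# Hypothetical "pictures" template; a real template would likely cover more formats.
PICTURE_TEMPLATE = ["*.jpg", "*.jpeg", "*.png", "*.gif", "*.bmp"]

def crawl_for_template(root, patterns, include_matches=True):
    """Walk a directory tree and yield the files selected by the template.

    include_matches=True  -> all files matching the template
    include_matches=False -> all files except those matching the template
    """
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            matches = any(fnmatch.fnmatch(name.lower(), pattern) for pattern in patterns)
            if matches == include_matches:
                yield os.path.join(dirpath, name)

# Example: find every picture on the drive, wherever my wife saved it.
# for path in crawl_for_template("C:/", PICTURE_TEMPLATE):
#     print(path)
```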

Most people can easily associate compression and deduplication in their minds; after all, they have two common goals:

1. To reduce bandwidth going from the client to the backup server.
2. To reduce storage space once the data resides on the backup server’s storage volume.

Both deduplication and compression work alongside each other to accomplish the same goals, but they are also independent: compression does not affect your deduplication ratios, and deduplication does not affect your compression ratios. Backup providers will often combine both ratios into an overall “benefit ratio” for your convenience; for example, a 5:1 deduplication ratio and a 2:1 compression ratio combine into a 10:1 overall reduction.

Encryption and deduplication, on the other hand, have an even tighter bond, one that directly affects your deduplication ratio of protected data to stored data. To understand this bond, though, one must first understand how deduplication works in general.

Deduplication works by creating a hash (or checksum) of each block of data being backed up, before encryption and compression are performed. Whether a block of data already exists on the backup server can be checked by looking up whether its hash already exists on the storage server. If a particular hash already exists, the block of data does not need to be re-transferred or re-stored to disk; instead, a reference to that data is simply created.
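
As a rough illustration of that lookup, here is a minimal Python sketch with in-memory stand-ins for the vault’s block store and index; the names and structures are assumptions for the example, not ROBOBAK’s actual data model.

```python
import hashlib

storage = []       # stand-in for the vault's block store
block_index = {}   # block hash -> position of the stored block
references = []    # (file_id, offset, block hash) records used to rebuild files at restore time

def store_block(file_id, offset, block):
    """Hash the block (before compression/encryption); store the data only if the hash is new."""
    digest = hashlib.sha256(block).hexdigest()
    if digest not in block_index:
        storage.append(block)
        block_index[digest] = len(storage) - 1
    # New or duplicate, record a reference so the file can be reassembled later.
    references.append((file_id, offset, digest))
```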

Each company that backs up has access only to the files it backs up; however, some companies require extra security. These companies want to hold the encryption key themselves and never transmit it from their computers to the backup server. The files on the backup server’s storage disk must then be encrypted with an encryption key that only that particular customer knows. Since the storage server does not know the encryption key, deduplication of those blocks of data across customers becomes impossible.

The Intimate Relationship of Data Deduplication and Encryption

ROBOBAK by default allows vault-wide deduplication. Vault-wide deduplication means that your deduplication ratios of protected vs. stored data can be huge. We accomplish this by encrypting each block of data being backed up with its own randomly generated encryption key, and then associating each user’s access with each of these unique keys.
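
One way to picture this scheme is the sketch below, written in Python under heavily simplified assumptions: the placeholder cipher, the vault and user_keyring structures, and the key handling are all illustrative inventions, not ROBOBAK’s actual key management.

```python
import hashlib
import os

vault = {}         # block hash -> (ciphertext, block key): one stored copy per unique block
user_keyring = {}  # (user, block hash) -> block key: records which users may decrypt which blocks

def toy_encrypt(data, key):
    # Placeholder for a real cipher such as AES; for illustration only.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def backup_block(user, block):
    """Vault-wide dedup sketch: each unique block gets its own random key, and every
    user who backs up that block is simply granted access to the existing key."""
    digest = hashlib.sha256(block).hexdigest()
    if digest not in vault:
        block_key = os.urandom(32)  # unique random key per block
        vault[digest] = (toy_encrypt(block, block_key), block_key)
    block_key = vault[digest][1]
    # In a real system the block key would itself be wrapped with the user's key;
    # here we only record the association.
    user_keyring[(user, digest)] = block_key
```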

ROBOBAK also recognizes that some companies require that the encryption key used to encrypt the files being backed up is known only to them. This gives a customer extra peace of mind, because only they can restore those files; the backup server does not even know what the encryption key is.

In the preferences of the backup client, a user can change the default to use a customer and user encryption key that is known only to them. When using these encryption keys, though, deduplication ratios will not be vault-wide. ROBOBAK also lets you enable or disable subsets of this functionality and set defaults via its free customization kit utility.

When most people think of cloud storage, they imagine an infinite horizon of bits and bytes, a vast, limitless space in which to store their data. But cloud storage is not always the boundless resource that people imagine. For applications that have heavy disk I/O activity or transfer a lot of data, cloud storage can become a significant cost and inherently limiting.

But there is a solution for reducing high I/O activity and unnecessary bandwidth use with cloud storage: data de-duplication, the process of eliminating duplicate data prior to transfer. This technology works by comparing differential changes in data or files before transferring the data to its destination. By employing a data de-duplication method, companies can significantly reduce costs associated with application storage in the cloud: first, by transferring only files that have changed, thereby saving on bandwidth costs; second, by reducing I/O on the storage cluster and storing only what has changed.

What is Source Based Data De-Duplication?

 

Yet not all data de-duplication solutions are created equal. In fact, in many cases, the implementation of the technology is far from efficient. Although there is some benefit in transferring only files that have changed, if those files are very large, such as geographic survey images that may be hundreds of gigabytes in size, there is little gained from file-level de-duplication.

That’s what makes a solution such as ROBOBAK so powerful. Rather than transferring whole files whenever anything changes, ROBOBAK focuses on actual blocks of data, merging the incremental blocks back into a file only at restore time. For example, let’s say there was a 1MB file that needed to be transferred because it had changed. ROBOBAK would divide the file into four ~250KB blocks and look for differential changes within the blocks, transferring only those blocks that have changed rather than the whole file. This provides even greater efficiency than other data de-duplication technologies by enabling further savings on bandwidth and disk I/O.
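
Under simplified assumptions (fixed-size blocks and an in-memory comparison against the previous version’s block hashes), the 1MB example above could look something like the following sketch. It illustrates block-level differencing in general, not ROBOBAK’s actual algorithm.

```python
import hashlib

BLOCK_SIZE = 256 * 1024  # ~250KB blocks, as in the 1MB example above

def block_hashes(data):
    blocks = [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]
    return [hashlib.sha256(b).hexdigest() for b in blocks], blocks

def changed_blocks(old_data, new_data):
    """Return only the blocks of the new version that differ from the previous backup."""
    old_hashes, _ = block_hashes(old_data)
    new_hashes, new_blocks = block_hashes(new_data)
    changed = []
    for i, (digest, block) in enumerate(zip(new_hashes, new_blocks)):
        if i >= len(old_hashes) or old_hashes[i] != digest:
            changed.append((i, block))
    return changed

# A 1MB file with a change in its second block: only that one ~256KB block would be transferred.
old = bytes(1024 * 1024)
new = bytearray(old)
new[300_000] = 1
print(len(changed_blocks(old, bytes(new))))  # -> 1
```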

Data de-duplication is a perfect technology for companies wanting to use the cloud for data storage, because its inherent operation saves on bandwidth and storage by transferring or storing only what has changed. But the savings are only as good as the solution, and none is better than ROBOBAK’s block-level data de-duplication technology for your business’s cloud-based backup strategy.