Hi All,

Just looking for a little outside advice on backups and viability.

I’m working in the IT team at a small, young company at the moment. We’re in the business of large image files and reams of image analysis, so we use a lot of storage.

Not counting the individual workstations in the office, our fileserver is using around 20TB at the moment and that number will grow and grow.

I’m constantly being hounded by the execs that we should be backing up all our data offsite, and no matter how much I tell them about our in-house failovers and RAIDs and all that good stuff… they won’t drop the idea of “if the office burns down we’ll lose everything”.

Is there a glorious solution I’m missing? And how do other big companies store a butt-ton of data offsite? Or do they not?

Thanks all!

Tayler

To be honest, they are absolutely correct. You should have an offsite backup copy of your data. Whether you do tape, drives, or cloud, you need it. Do it.

Kinda builds into the DR/BC strategy… if the building did burn down, what would you do to recover, and how much would it cost in revenue/stock etc.?

You’d have problems getting those pictures back for a start.

How long can you be down before it affects customers and revenue?

I have to agree with the execs on this one: you need to get backups, and get them offsite. You need to sit down with them and attach a dollar value to all of that data to establish a budget for quality backup software and storage. It sounds like you have the opposite problem to most of us, in that management is pushing this, so budget should be easier. Work with them; attach costs to downtime for recovery, corrupt data, etc. Find out how often the data in question is changing.

More than just the “bomb dropped on the office” scenario, you are one clicked email and Locky infection away from a nightmare. Back up often, and test those backups often.
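
A cheap way to actually test a restore: restore to scratch space, then hash a random sample of files against the originals. A minimal sketch in Python; the paths and sample size are placeholders, and a real test would sample more broadly.

```python
# Minimal spot-check of a restore: hash a random sample of source files and
# compare against the restored copies. Paths and sample size are placeholders.
import hashlib
import os
import random

SOURCE = "/mnt/fileserver"      # live data (placeholder path)
RESTORED = "/mnt/restore-test"  # where the test restore landed (placeholder)
SAMPLE = 25                     # how many files to check

def sha256(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

all_files = [os.path.join(root, name)
             for root, _dirs, files in os.walk(SOURCE) for name in files]
for path in random.sample(all_files, min(SAMPLE, len(all_files))):
    rel = os.path.relpath(path, SOURCE)
    restored = os.path.join(RESTORED, rel)
    ok = os.path.isfile(restored) and sha256(path) == sha256(restored)
    print(("OK  " if ok else "FAIL"), rel)
```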

Thanks for the advice, way to make my life more difficult :wink:

You’re right; I’m not denying the execs have a point. My concern is with the how.

I mean, do you recommend physically taking a bag full of tapes off site every month? Or using something like AWS for backing up?

If you have the bandwidth, AWS Glacier can be very cost-effective. You can even send them seed disks via courier to get the data going, then just drip-feed in changes every day/hour etc.
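
For the drip-feed part, a rough sketch of the idea in Python using the boto3 SDK; the bucket name, source path and state file are placeholders, and a real job would also handle retries and deletions:

```python
# Sketch of the "drip feed": walk the share and push anything modified since
# the last run to S3 with the Glacier storage class. Assumes boto3 is
# installed and AWS credentials are configured.
import os
import time
import boto3

s3 = boto3.client("s3")
BUCKET = "example-offsite-backup"        # hypothetical bucket name
SOURCE = "/mnt/fileserver"               # placeholder share path
STATE_FILE = "/var/lib/backup/last_run"  # timestamp of the previous sync

def last_run_time():
    try:
        with open(STATE_FILE) as f:
            return float(f.read())
    except FileNotFoundError:
        return 0.0  # first run: everything counts as changed

def sync_changes():
    cutoff = last_run_time()
    started = time.time()
    for root, _dirs, files in os.walk(SOURCE):
        for name in files:
            path = os.path.join(root, name)
            if os.path.getmtime(path) > cutoff:
                key = os.path.relpath(path, SOURCE)
                s3.upload_file(path, BUCKET, key,
                               ExtraArgs={"StorageClass": "GLACIER"})
    with open(STATE_FILE, "w") as f:
        f.write(str(started))

if __name__ == "__main__":
    sync_changes()
```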

Look at AWS, especially Glacier, for files that won’t be accessed often.

Companies like CloudBerry make it easier to perform the backups. You also have the option of shipping drives to AWS to make the initial backup much faster.

Pretty sure you can’t ship them tapes though :slight_smile:

I tell you, I underestimated how much of a pain working with huge amounts of data is… I’m used to a handful of users who basically just send emails all day!

What have I got myself into :slight_smile:

If you need some experienced help, PM me and I can introduce you to some folks we use for infrastructure consultancy. This amount of data isn’t that unusual these days.

Oh, and RAID5: I’d look deeper into that, given the rebuild issues that come with larger modern drives.
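
To put a rough number on that: using the commonly quoted consumer-drive spec of one unrecoverable read error (URE) per 10^14 bits read, a rebuild of a large array has a surprisingly high chance of hitting one. A back-of-the-envelope sketch (drive count and size are illustrative, not vendor data):

```python
# Back-of-the-envelope odds of a RAID5 rebuild finishing without hitting an
# unrecoverable read error (URE), using the commonly quoted consumer spec of
# one URE per 1e14 bits read.
URE_PER_BIT = 1e-14   # quoted URE rate for many consumer drives
DRIVE_TB = 8          # size of each surviving drive (illustrative)
SURVIVORS = 5         # drives that must be read in full to rebuild

bits_read = SURVIVORS * DRIVE_TB * 1e12 * 8
p_clean = (1 - URE_PER_BIT) ** bits_read
print(f"bits read during rebuild: {bits_read:.2e}")
print(f"chance of a clean rebuild: {p_clean:.1%}")
# -> ~4% with these numbers, which is why RAID6 or mirrors are usually
#    suggested once individual drives get this large.
```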

Thanks Martin, I really appreciate that.

If you have a low change rate, you might be able to seed an offsite backup. Also, like Martin said, your data pipe is going to be important in determining any solution. Do you perhaps have a second location you might use? Are you actually using all of that data? Separating production data from archived data can help you lower the amount you move offsite daily.
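
A quick way to get a feel for that split is to total up how much of the data hasn’t been touched in a while. A rough sketch; the path and the 180-day cutoff are placeholders:

```python
# Quick audit of how much data is "cold": total the size of files untouched
# for more than CUTOFF_DAYS. Path and cutoff are placeholders -- tune them to
# whatever "archived" means for your projects.
import os
import time

SOURCE = "/mnt/fileserver"
CUTOFF_DAYS = 180
cutoff = time.time() - CUTOFF_DAYS * 86400

active = cold = 0
for root, _dirs, files in os.walk(SOURCE):
    for name in files:
        try:
            st = os.stat(os.path.join(root, name))
        except OSError:
            continue  # file vanished mid-walk; skip it
        if st.st_mtime < cutoff:
            cold += st.st_size
        else:
            active += st.st_size

print(f"active: {active / 1e12:.2f} TB, cold: {cold / 1e12:.2f} TB")
```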

Hey Tayler,

Give CloudBerry Backup a try.

It lets you push your data to a number of storage providers such as Amazon AWS, Azure, Google Cloud, Oracle Cloud, Rackspace, Verizon, Backblaze and so on.

Server Edition: MSP360 Backup Products For MSPs and Businesses of All Sizes

For fileservers you’ll have to use the Ultimate Edition (formerly Enterprise): MSP360 Backup Products For MSPs and Businesses of All Sizes

If any questions arise, feel free to contact me directly.

Thanks

Tayler,

It is a good idea to have a copy of the data offsite. You may want to consider Asigra Cloud Backup, as it is an agentless, cloud-based backup and recovery software solution that backs up data residing on physical and virtual servers, enterprise databases and applications, workstations, laptops, Office 365, Google Apps and Salesforce.com. You can keep a local copy and an offsite copy.

Here is a case study demonstrating how the software can protect large amounts of data across multiple sites:

http://www.asigra.com/sites/default/files/resource_center/case-study-georgia-tech.pdf

Key features of the software:

  • Agentless
  • Hardware Agnostic
  • Supports Windows, Linux, Unix, Mac, iOS and Android
  • Supports VMware and Hyper-V
  • AES-256 encryption of data in flight and at rest
  • NIST FIPS 140-2 Certified
  • Incremental Forever, Global Deduplication, Compression and Continuous Data Protection
  • Autonomic Healing and Restore Validation

Please let us know if you would like to schedule a product demo.

If you don’t have the data offsite, you aren’t following the basic 3-2-1 backup principle:

3 copies of data, 2 different types of media, 1 offsite

So if the place does burn down, what are you going to do? The business will almost certainly fail.

We offsite our servers (a little bigger than your size) to another location via a private 1Gbps link: Veeam replicates the VMs, and then we offsite to another location using ShadowProtect. The first backup was seeded using a large storage device (never underestimate the bandwidth of a big JBOD in a van); after that, incremental differences take no time at all.

As you are starting at 20TB, seeding is your best option. A JBOD NAS-type solution is pretty easy and cheap, and you can reuse it for something else afterwards.
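
The arithmetic behind seeding is worth spelling out. A rough sketch, assuming the link runs flat out with zero overhead (it won’t):

```python
# Rough time to push 20 TB over various uplinks, assuming the link runs flat
# out with zero overhead -- real transfers will be slower.
DATA_TB = 20
data_bits = DATA_TB * 1e12 * 8

for label, mbps in [("100 Mbps", 100), ("1 Gbps", 1_000), ("10 Gbps", 10_000)]:
    days = data_bits / (mbps * 1e6) / 86400
    print(f"{label:>8}: {days:5.1f} days")
# 100 Mbps ~ 18.5 days, 1 Gbps ~ 1.9 days -- an overnight-couriered JBOD
# beats most office uplinks for the initial seed.
```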

Tayler, I agree with the execs: getting the data offsite is critical. At your data footprint I would recommend a solution that lets you “tier” the data by moving older data to a low-cost archive. That will be the only practical way to handle large data growth.

I have written a blog post that addresses that very issue and would encourage you to read it. PM me if you wish to discuss further.

Well, I think you should agree with them, and I don’t think this is particularly difficult, as you have a lot of options. You could try a private cloud, where your data stays within your own network.

Tayler,

Yes, you should have an offsite backup along with the on-premises one. Most companies keep one copy of backup data on-premises and another copy offsite, i.e. in a cloud. In a disaster, they try to recover from the on-premises copy first.

Most backup software offers offsite backup too, either as a SaaS solution or as replication to the cloud from an on-premises backup server. You may choose the second option, which gives you two copies of the backup data.

Vembu Network Backup offers the second option: you keep the backup data on your on-premises NAS and replicate it to the cloud (AWS).

You can get more details and features from the Vembu site.

There are lots of options out there: cloud, offsite to another office or rented rack space, tapes, external drives.

Look at the budget, your internet pipe and connections to other locations, and put together a plan that a) covers you for a disaster and b) doesn’t overly complicate the process.

Depending on your growth and change rate, cloud would seem like a good option.

I would agree with Denis above: look at whether the data can be split into active and non-active. We do this; once projects are finished, they are either archived and taken offline, or archived and then moved to a read-only storage area. This way we only continually back up the data that is current, keep primary storage costs down, and have a more relevant data set to restore should a disaster occur.

Toby is correct (again): 3-2-1 model at all times.