
How to Execute Azure Backup? Azure VM Backup and Disaster Recovery with Bacula.

  • July 27, 2018

Bacula Enterprise Cloud Plugin for Azure - Technical Overview

Backup to the cloud is a nebulous topic. From public clouds hosted by a handful of major players, to private clouds creating clusters of highly available storage within a company's own datacenters, to hybrids of the two and everything in between, the explosive growth of cloud data storage options has naturally led companies to want to back up their data to these same locations. Backup has its own requirements and considerations, some of which are complicated by cloud data storage principles, and in this article we'll discuss how the Bacula Enterprise Cloud Plugin for Azure addresses these to allow backup directly to Azure storage. We'll also explain the basic steps that you have to perform inside your Microsoft Azure portal and on the locally installed Bacula Enterprise Storage Daemon to get this type of backup running in no time.

The advantages of cloud storage for backup are clear: highly available, highly reliable storage with low upfront cost, on-demand growth, and easy offsite placement of data, while increased upload bandwidth and cloud-side processing have opened new ways to take advantage of cloud storage. However, complications remain, such as the amount of bandwidth required, increased complexity, management of stored data to keep costs in line, and potentially very long restore times should disaster strike your primary site.

Fortunately, the Bacula Enterprise Cloud Plugin for Azure has been designed to help take advantage of cloud storage while helping mitigate some of the traditional disadvantages of backing up large amounts of data to the cloud.

The Bacula Enterprise Cloud Plugin for Azure allows a Bacula Storage Daemon to write directly to Azure storage. If you're not familiar with the Bacula Enterprise architecture, please see the architecture overview here: https://www.youtube.com/watch?v=EXTo2Hj8RU8. Once configured, the Azure storage blob becomes part of your Bacula Enterprise storage and can be used by any backup job. Bacula also uses a local cache, configurable to whatever size is desired, which helps optimize upload and storage usage so that data can be recovered with minimal access fees from the cloud provider. In addition, the cache can be configured to act as a first-line restore point for your most recent backups, allowing fast recovery of the data before it is moved to the cloud on your schedule. The caching feature helps overcome some of the concerns associated with cloud backup and gives the administrator the most control over their storage.
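The caching and upload behavior described above is controlled by directives in the Cloud{} resource on the Storage Daemon. As a hedged sketch only (directive names taken from the Bacula cloud driver documentation; verify them against the documentation shipped with your plugin version), cache control can look like this:

```
Cloud {
  Name = "Azure-cloud"
  Driver = "Azure"
  # ... credentials and container settings ...
  Upload = EachPart             # upload each cache part as soon as it is written
  MaximumPartSize = 100 MB      # split volumes into parts of this size
  TruncateCache = AfterUpload   # free local cache space once parts are uploaded
}
```

With Upload = EachPart and a generous cache size, recent backups remain in the local cache for fast restores while still being pushed to Azure continuously.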

Now let's move on to the configuration 'how to' for both your Azure portal and Bacula Enterprise Edition.

Azure portal configuration how-to

1. Log into your Azure portal, click "Resource Groups" and then click "+Add" to create a new Resource Group.

2. In the Microsoft Azure Dashboard, in the far left menu, click "Storage accounts".

3. After clicking on "Storage accounts" click the "+Add" button in the upper part of the window to add a new BlobStorage.

4. Give the storage account a name; it must consist of lowercase letters and numbers only, and must be unique across all of Azure. In the "Account kind" field, select "Blob storage". For "Resource Group", click "Use existing" and choose your Resource Group from the drop-down (if you did not create a Resource Group in step 1, you can also create a new one on the fly here). Click "Create" to save; deployment may take a minute.

5. In the far left menu, click on "Storage accounts".

6. Click on the newly created BlobStorage Account.

7. Click "Settings --> Access keys"; here you find the "Storage account name", which is what we use for the "AccessKey" directive in the Bacula Cloud resource.

8. Under the bold "Key1" heading, the "Key" listed there is the value we use for the "SecretKey" directive in the Bacula Cloud resource.

9. This step is not strictly necessary, as Bacula will create a Blob Container automatically if it does not exist. Click "BLOB SERVICE --> Containers", then click "+Container" at the top to create a new container. Its name will be the "BucketName" that we use in the Bacula Cloud resource. Give it a name, select "Blob" in the "Public access level" drop-down, and click "OK" to create the new Blob Container.

10. Confirm that the Settings in the Azure portal which were just created/edited match the Bacula Cloud resource config file (which we will create in the next part of this tutorial, see below).
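The portal steps above can also be performed from the command line with the Azure CLI. The sketch below assumes a logged-in `az` session; the resource group, account, location, and container names (`rg-bacula`, `baculablobstore`, `westeurope`, `bsyscontainer`) are examples only and must be replaced with your own:

```
# Create a resource group
az group create --name rg-bacula --location westeurope

# Create a blob storage account (lowercase letters and numbers, globally unique)
az storage account create --name baculablobstore --resource-group rg-bacula \
    --kind BlobStorage --sku Standard_LRS --access-tier Hot

# List the access keys; the "key1" value is the SecretKey for Bacula
az storage account keys list --account-name baculablobstore --resource-group rg-bacula

# Create the container that Bacula will use as its BucketName
az storage container create --name bsyscontainer --account-name baculablobstore
```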

Configuration of Bacula Enterprise Edition

Once the Azure side has been prepared, you will need to create a few new Bacula resources in your configuration files. We will need a Storage{} resource (or Autochanger{}) on the Bacula Director, pointing to the Azure storage device that we will define on the Bacula Storage Daemon:

Storage {
  Name = "Azure-storage"
  Address = "bacula-storagedaemon.yourdomain.org"
  SDPort = 9103
  Password = "xxxxxxxxxxxxxxxxxx"
  Device = "Azure-CloudDevice"
  Media Type = "AzureVolume"
  Maximum Concurrent Jobs = 10
}

and on the Storage Daemon we will configure a new Device{} resource (or an Autochanger{} resource with more than one device) and the Cloud{} resource that will point to your Azure cloud storage target. The Cloud{} resource is available with the Cloud Plugin for Azure for Bacula Enterprise Edition.

This is what the Storage Daemon configuration resources look like in our example configuration:

Device {
  Name = "Azure-device"
  DeviceType = "Cloud"
  Cloud = "Azure-cloud"
  Media Type = "AzureVolume"
  Archive Device = "/opt/bacula/volumes/azure"
  LabelMedia = "Yes"
  Random Access = "Yes"
  AutomaticMount = "Yes"
  RemovableMedia = "No"
  AlwaysOpen = "No"
  Maximum Concurrent Jobs = 5
}

Cloud {
  Name = "Azure-cloud"
  Driver = "Azure"
  Hostname = "dummy - not used for Azure"
  BucketName = "bsyscontainer"
  Accesskey = "bsystest"
  Secretkey = "xnligNkNEUpFKLvGP8g9itZujF4ssjZMl9Ya9XE53qqy3a0zhgjxzhIH5N19XKhLcpfifnUkQW258MzCKEBA=="
  Protocol = "https"
  UriStyle = "Path"
  Upload = "EachPart"
}
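If you want multiple concurrent cloud jobs on separate devices, the Device{} resource can instead be wrapped in a virtual Autochanger{}, as mentioned above. A hedged sketch (the device names are examples; each listed device needs its own Device{} resource like the one shown earlier, and the Director's Storage resource then references the autochanger):

```
Autochanger {
  Name = "Azure-autochanger"
  Device = "Azure-device1", "Azure-device2"
  Changer Command = ""          # no physical changer: virtual autochanger
  Changer Device = "/dev/null"
}
```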


Final touches

On the Bacula Director, a "reload" of the configuration from bconsole is sufficient once you have tested the configuration syntax with the "-t" (test syntax) option of the bacula-dir binary.
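For example, the Director-side check and reload can be done as follows (the paths are the default Bacula Enterprise install locations; adjust them if your installation differs):

```
# Test the Director configuration syntax
/opt/bacula/bin/bacula-dir -t -c /opt/bacula/etc/bacula-dir.conf

# Reload the configuration from bconsole without restarting the Director
echo "reload" | /opt/bacula/bin/bconsole
```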

On the Storage Daemon, you will have to restart the daemon (see below). Please make sure that no backup jobs are running on the Storage Daemon when you restart this service:


Stop the bacula-sd daemon:
# systemctl stop bacula-sd

Test the SD config syntax:
# /opt/bacula/bin/bacula-sd -u bacula -t

Start the bacula-sd daemon:
# systemctl start bacula-sd


Now you are ready to go and do backups to your Azure storage in the cloud. Just create a new backup job, or modify one of your existing backup jobs to use the new Cloud Storage/Autochanger instead of local disk or tape storage targets. The restore process will recover your data from the cloud transparently onto any Bacula File Daemon (client) that is available to the director.
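As an illustration only, a minimal Job resource on the Director using the new storage could look like the sketch below. The Client, FileSet, Schedule, Pool, and Messages names are placeholders from a typical default configuration, not values defined in this article:

```
Job {
  Name = "Backup-to-Azure"
  Type = Backup
  Client = "myclient-fd"
  FileSet = "Full Set"
  Schedule = "WeeklyCycle"
  Storage = "Azure-storage"   # the Director Storage resource defined above
  Pool = "Default"
  Messages = "Standard"
}
```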
