
Microsoft Azure Backup Solutions. How to Backup On Premise Workloads to Azure?

Updated 28th November 2022, Rob Morrison

Backup to the cloud is a nebulous topic. From public clouds hosted by a handful of major players, to private clouds creating clusters of highly available storage within a company’s own data centers, to hybrids of the two and everything in between, the explosive growth of cloud data storage options has naturally led companies to want to back up their data to these same locations.

Backup has its own requirements and considerations, some of which are complicated by cloud data storage principles. In this article we discuss how the leading Azure backup solutions, including Bacula, address these to allow backup directly to Azure storage. We also explain the basic steps required to use the built-in tool – Azure Backup – which creates backups and uploads them to the cloud.

Built-in Azure Backup Solution

It should be mentioned that Microsoft Azure also offers its own backup-related service, called Azure Backup. This service does exactly what its name suggests – it creates backups of data and transfers them to the Azure cloud. It can back up both Azure VMs and on-premises workloads.

Azure Backup as a whole is not a particularly complicated system – it uses its own MARS (Microsoft Azure Recovery Services) agent to send data to an Azure backup vault (Recovery Services vault). That vault is in turn connected to an Azure cloud storage account, ready to accept backups as soon as you have finished setting up a backup schedule.

Let us look at this in a bit more detail. Here is the process of setting up an Azure storage backup, split into two parts – backup vault creation and backup scheduling.

  1. The first part of this process is all about creating a backup vault (an equivalent Azure CLI sketch follows this list). To do that, start by logging in to the Azure management portal.
  2. Right after that, you can find a “New” button at the bottom right of the screen – from here, select the following commands in order: “New – Data Services – Recovery Services – Backup Vault – Quick Create”.
  3. This sequence of menus lets you fill in two important fields: Vault name and Region.
  4. After the vault is created, select it and choose “Download vault credentials” under the first bullet point of the “Protect On-premise workloads” list. This operation saves a credentials file to your current system.
  5. Directly under the “Download vault credentials” option, you can see several variations of the “Download agent” operation, depending on your device type or backup purpose. Click the one that fits your use case and launch the install wizard.
  6. The installation process in all three cases is fairly straightforward – follow the install wizard’s recommendations to install everything with little to no effort. After the installation is complete, a pop-up window appears with the phrase “Proceed to Registration” – clicking on it continues the process.
  7. The first part of the registration process is vault credentials validation – this is where the previously saved vault credentials file comes in; browse to it within this same window.
  8. The following step covers the encryption settings – you can generate a random passphrase or enter your own. This passphrase is essential for any future restore, so store it safely. You also have to pick the location where the passphrase will be saved. Clicking “Next” one final time concludes this half of the process.
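For reference, the vault-creation half above can also be scripted with the Azure CLI. The following is only a minimal sketch, assuming the Azure CLI is installed and you are logged in; the resource group and vault names are placeholder examples:

Create a resource group and a Recovery Services vault:
# az group create --name backup-rg --location westeurope
# az backup vault create --resource-group backup-rg --name onprem-backup-vault --location westeurope

The vault credentials file and the MARS agent installer are still downloaded from the portal, as described in steps 4 and 5.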

Backup scheduling is the second part of the Azure Backup setup process, and it is significantly shorter:

  1. The first step is to launch the program we have just installed (if it did not launch automatically). In this program, you can see a panel of options on the right – pick “Schedule Backup” to begin.
  2. The “Schedule Backup Wizard” has multiple steps that help you decide what to back up, how often to perform backups, and so on. The first step is to pick the backup target.
  3. The second step sets the backup frequency – up to three times per day, or weekly. There are also separate options for backup retention time, as well as many other settings. When you have finished configuring the backup and exited the wizard, you can initiate it by clicking the “Back Up Now” button on the right side of the screen.
  4. Recovery works the same way, with the “Recover Data” button being the correct prompt.

Despite its simplicity, Azure Backup is a backup solution that offers a rather impressive list of advantages, such as:

  • Easy scalability;
  • App-consistent backups;
  • No data transfer limits;
  • Multiple backup storage locations to choose from;
  • Storage management automation;
  • Data retention (short-term and long-term) via Recovery Service vaults, and more.

However, Azure Backup is not a perfect solution, and it is far from the only one that can back up data to Azure cloud storage. With that in mind, here are five best practices for picking a specific backup and recovery solution (with Azure Backup as our running example):

  1. Backup target. The entire plan that describes your backup and recovery efforts – your backup strategy – changes a lot depending on the backup target that you have chosen. There is a massive difference between backing up regular files and folders, and creating app-aware backups of specific instances of VMs. Other variations of backup targets are databases (Azure SQL database backups are one such example), services, all kinds of VMs, and more – and you should figure out if a backup solution supports your backup target before getting it.
  2. Backup performance. Another significant part of any backup strategy is recovery timing – how long you can afford to spend restoring a service or a server (RTO, Recovery Time Objective), and how much data, measured in time since the last backup, you can afford to lose (RPO, Recovery Point Objective). Both of these depend on a large number of factors, including backup time, data transfer time, the highest possible bandwidth of your channel, and more.
  3. Company’s backup requirements. This is potentially a rather large topic, covering multiple different requirements for a company’s backup needs – and it is also used to figure out most of the other parts of this “best practices” list. It is highly recommended to figure out beforehand what specific features your company needs in terms of backup, including legal requirements, availability requirements, and many others.
  4. Backup price. Pricing is also important when picking a cloud backup and recovery solution, since different service providers often use different models for cloud storage space. For example, Azure Backup offers a pricing system that consists of two main components: a variable charge depending on the amount of storage used by the backup, and a flat charge depending on the instance size (a simple worked example follows this list). As such, it is important to find out what model your backup service uses beforehand.
  5. Recovery process. While backup as a process is important, it would be extremely careless to forget about the other half – the recovery process. This part of the service can also vary quite a lot depending on your cloud backup service of choice. For example, the Azure backup and recovery system offers the ability to recover VMs, workloads, system states and individual files; it can also be used to monitor backups, generate reports, run simple diagnostics, and more.
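To make the pricing model from point 4 concrete, here is a deliberately simplified, hypothetical calculation – the rates are invented for illustration only and are not Azure’s actual prices, which vary by region and change over time:

Monthly cost ≈ (protected instances × per-instance fee) + (backup storage in GB × per-GB rate)
Example with made-up rates: 10 instances × $10 + 2,000 GB × $0.02/GB = $100 + $40 = $140 per month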

In this context, we should also mention another similar service – Azure Site Recovery. It is another highly specific service that provides a comprehensive Microsoft Azure disaster recovery feature set for your data, including copying VMs, enabling failover, and so on.

Cloud backup – advantages and complications

The advantages of cloud storage for backup are clear: highly available, highly reliable storage with low upfront cost, on-demand growth, and ease of moving data offsite, while increased upload bandwidth and cloud processing power have opened new ways to take advantage of cloud storage. However, complications remain, such as the amount of bandwidth required, increased complexity, management of stored data to keep costs in line, and potentially very long restore times should disaster strike your primary site.

Restore times are one of the many reasons why it is so important to have a functional backup system in the first place. Veeam’s Data Protection Report (2021) puts the average cost of downtime at approximately $1,410 per minute (a staggering $84,650 per hour). These figures are likely skewed by larger organizations reporting far higher potential losses, but the message stays the same – downtime is a massive financial problem for any business.

Additionally, one of the biggest causes of downtime today – ransomware – results in an average downtime of over 16 days, according to a Coveware study (2021). This kind of figure, combined with the average downtime costs from the source above, is already a great reason for any company to invest in a comprehensive backup system.

Microsoft Azure backup solutions for Azure instances

As we have mentioned before, Azure Backup is not the only solution to offer backup services for multiple different data storage locations (including Azure cloud itself). As such, we have several more examples of different services with similar functionality:

  1. Veeam Backup. Veeam is a popular backup provider whose solution supports a multitude of different backup locations and storage providers – including Azure data backup. Veeam is capable of working in tandem with Azure Backup, taking a copy of the data directly from the Azure Backup vault. As such, Veeam can monitor Azure Backup’s data copy as the primary one without recreating its own copy, offering better response times and better performance as a whole, combined with many useful features of Veeam.
  2. Veritas NetBackup. Expanding upon the topic of multifunctional backup solutions with Azure support, Veritas makes this list, offering an automated, feature-rich resiliency platform with a multitude of data protection options. It supports a good number of different storage types and locations, can work with both regular data and virtual machines, offers workload migration capabilities to and from Azure, and has a centralized console for all of its operations. The large range of capabilities that Veritas offers justifies its place on this list of Azure backup solutions.
  3. Vembu BDR. Available both as an on-premises deployment and as a cloud service, Vembu BDR is another example of a popular multifunctional backup solution that can also work with Azure data. Vembu offers multiple options when it comes to Azure, be it a backup to another account, a backup to a different location, and more – combined with a user-friendly interface and responsive customer support.
  4. Cohesity DataProtect. Cohesity DataProtect is known for simple yet effective data protection, with a broad feature set aimed at a smooth user experience. It is a comprehensive data platform designed to defend its users against ransomware attacks. It helps with mitigating legacy infrastructure issues and has comprehensive policy-based protection for your servers. As for the Azure-specific features, there are cloud-native backups, direct backups, dev/test capabilities and many other options that make Cohesity a strong contender among Azure backup solutions.
  5. Druva inSync. User data tracking and monitoring is one of the many things that Druva inSync excels at, offering unified data protection across multiple workloads. It can perform backups, monitor data for compliance purposes, enhance overall data visibility, and so on. Other features include automatic user provisioning for Azure, cross-platform capabilities, AD integration and the ability to address different backup types. The ability to centralize backup and restore operations for the entire company is also extremely useful.
  6. Acronis Cyber Backup. Acronis is a well-known name in backup, so it is no surprise that Acronis Cyber Backup also supports Azure VMs. It can perform app-aware backups, work in tandem with Azure Backup, create its own copy alongside the Azure Backup one, and more. Acronis Cyber Backup can get a bit overwhelming at times, but it is definitely not lacking in features.
  7. NAKIVO Backup & Replication. NAKIVO puts quite a lot of emphasis on being user-friendly and easy to deploy. Combined with a wealth of features and use cases, this approach to data protection has made it a well-known solution on the market. NAKIVO offers backup and recovery operations, ransomware protection, easy workload scheduling, granular instant recovery and helpful technical support. A feature that makes it a great addition to this list of Azure backup solutions is its ability to use Azure as a storage location, among many other targets (cloud or otherwise).
  8. Dell Technologies. As a long-standing Microsoft partner in multiple fields, Dell also has a number of solutions dedicated to backing up on-premises and remote workloads to Microsoft Azure. One such solution is Dell PowerProtect Cyber Recovery – a combination of a backup solution and a data security system that offers a plethora of features to its users. It can be deployed extremely quickly and is capable of automatically creating copies of existing data to isolate them in a secure digital vault; there are also features such as data governance support, dynamic restore capabilities, expert guidance from Dell Technologies’ own specialists, and more.
  9. Rubrik Cloud Data Management. A unified software solution that promotes ease of use in many different scenarios is what Rubrik is known for, providing an API-first architecture with support for many types of storage, operating systems, cloud storage providers and applications. It is a strong Azure backup solution for protecting data from ransomware; it makes it easy to transfer workloads, accelerates cloud adoption and enables advanced data protection across many different applications and storage locations – which is why Rubrik belongs on this list.
  10. NetApp Cloud Volumes ONTAP. As you may have seen, a lot of backup systems try to work with the existing Azure Backup functionality. NetApp follows a slightly different path, offering its own take on a backup system that claims to work better and cost less than Azure Backup. It can work with Hyper-V and VMware instances within Azure, can perform cloud-to-cloud backups, offers a secondary backup, and more.
  11. Datto Continuity. Although discussed as a backup solution in this use-case, Datto Continuity is a solution that focuses more on disaster recovery and business continuity. It is on the list of Azure backup solutions due to its alternative capabilities in terms of Azure VM backups – securing processes and information, rather than storing actual data. It works by pulling copies of Azure data to its own servers on a regular basis, providing its users with the ability to restore any process that suffers an error or loses its data, improving overall business continuity.
  12. DSP-Explorer. Still on the topic of less usual approaches to backup solutions and services, DSP-Explorer is not even a dedicated backup service – yet it is on this list of Azure backup solutions because it offers expertise and professional help. DSP-Explorer is a team of experts in the field of managing Microsoft Azure backups, and it is a great choice for smaller businesses that cannot afford to have an entire IT department to manage all of their Azure backup operations. However, it is worth noting that this kind of service also means that your company’s security is in the hands of a third party.
  13. Bacula Enterprise. This is a high-end, highly scalable backup solution designed for medium and large enterprises where a high level of dependability and security is required. As such, it offers support for many different storage locations and services, including, of course, Azure. You can use it to manage multiple accounts, work with other cloud storage providers, develop an advanced hybrid cloud strategy, and deploy special configurations that enable cached, accelerated restores of specific files, applications and other data – making it a strong choice for backing up on-premises workloads to Azure. Bacula also backs up Azure VMs, supports storage tiers, and offers some of the highest security levels in the industry.

The Bacula Enterprise Cloud Plugin is what we will look at in more detail next.

Bacula Enterprise and Azure

The Bacula Enterprise Cloud Plugin for Azure has been designed to help take advantage of cloud storage while helping mitigate some of the traditional disadvantages of backing up large amounts of data to the cloud.

The Bacula Enterprise Cloud Plugin for Azure allows a Bacula Storage Daemon to write directly to Azure storage. If you’re not familiar with the Bacula Enterprise architecture, please see the architecture overview here: https://www.youtube.com/watch?v=EXTo2Hj8RU8. Once configured, the Azure storage blob becomes part of your Bacula Enterprise storage and can be used by any backup job. Bacula also uses a local cache, configurable to whatever size is desired, which helps optimize upload and storage usage so that data can be recovered with minimal access fees from the cloud provider. In addition, the cache can be configured to act as a first-line restore point for your most recent backups, allowing fast recovery of the data before it is moved to the cloud on your schedule. The caching feature helps overcome some of the concerns associated with cloud backup and gives the administrator more control over their storage.
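To illustrate how that cache behaviour is tuned, the Cloud{} resource accepts upload- and cache-truncation directives. The fragment below is only a sketch – the connection details are shown later in this article, and the exact directive names and defaults should be confirmed against your Bacula Enterprise version’s documentation:

Cloud {
  Name = "Azure-cloud"
  Driver = "Azure"
  # ... connection directives as shown in the configuration section below ...
  Upload = "EachPart"              # push each cache part to Azure as soon as it is written
  Truncate Cache = "AfterUpload"   # free local cache space once a part has been uploaded
  Maximum Concurrent Uploads = 5   # number of parallel part uploads
}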

Now let’s move on to the configuration ‘how to’ for both your Azure portal and Bacula Enterprise Edition.

Azure portal configuration how-to

1. Log into your Azure portal, click “Resource Groups” and then click “+Add” to create a new Resource Group.

2. In the Microsoft Azure Dashboard, in the far left menu, click “Storage accounts”.

3. After clicking on “Storage accounts” click the “+Add” button in the upper part of the window to add a new BlobStorage.

4. Give the account a name; it must consist of lowercase letters and numbers only, and must be unique across all of Azure. In the “Account kind” field, select “Blob storage”. For “Resource Group”, click “Use existing”, then choose the Resource Group from the drop-down. Click “Create” to save – this may take a minute to deploy. If you did not create a Resource Group at the beginning, you can also create a new one on the fly in this step of the Azure preparation.

5. In the far left menu, click on “Storage accounts”.

6. Click on the newly created BlobStorage Account.

7. Click “Settings –> Access keys”; here you will find the “Storage account name”, which is what we use for the “AccessKey” directive in the Bacula Cloud resource.

8. Under the “Key1” bold heading, the “Key” listed here is the “SecretKey” that we use in the Bacula Cloud resource.

9. This step is not strictly necessary, because Bacula will create a Blob Container if it does not exist. Click on “BLOB SERVICE –> Containers”, then click “+Container” at the top to create the new container. This will be the “BucketName” that we use in the Bacula Cloud resource. Give it a name, and select “Blob” in the “Public access level” drop-down. Click “OK” to create this new BlobContainer.

10. Confirm that the settings in the Azure portal which were just created/edited match the Bacula Cloud resource configuration file (which we will create in the next part of this tutorial, see below). An equivalent Azure CLI sketch of the portal steps above follows.
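For those who prefer the command line, the portal steps above can be approximated with the Azure CLI. This is only a sketch, assuming the CLI is installed and authenticated; the resource group, account and container names are examples (the storage account name must be globally unique):

Steps 1–4 – resource group and Blob storage account:
# az group create --name bacula-backup-rg --location westeurope
# az storage account create --name baculablobstore01 --resource-group bacula-backup-rg --location westeurope --kind BlobStorage --sku Standard_LRS --access-tier Cool

Steps 7–8 – the account name and one of its keys map to the AccessKey and SecretKey directives:
# az storage account keys list --account-name baculablobstore01 --resource-group bacula-backup-rg

Step 9 – optional container (the BucketName in the Cloud resource); Bacula can also create it itself:
# az storage container create --name bsyscontainer --account-name baculablobstore01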

Configuration of Bacula Enterprise Edition

Once the Azure side has been prepared you will need to create a few new Bacula resources in your configuration files. We will need a Storage{} resource (or Autochanger{}) on the Bacula Director to point to our Azure storage that we are going to create on the Bacula Storage Daemon:

Storage {
  Name = "Azure-storage"
  Address = "bacula-storagedaemon.yourdomain.org"
  SDPort = 9103
  Password = "xxxxxxxxxxxxxxxxxx"
  Device = "Azure-device"          # must match the Device (or Autochanger) Name on the Storage Daemon
  Media Type = "AzureVolume"
  Maximum Concurrent Jobs = 10
}

and on the Storage Daemon we will configure a new Device{} resource (or an Autochanger{} resource with more than one device) and the Cloud{} resource that will point to your Azure cloud storage target. The Cloud{} resource is available with the Cloud Plugin for Azure for Bacula Enterprise Edition.

This is what the Storage Daemon configuration resources look like in our example configuration:

Device {
  Name = "Azure-device"
  DeviceType = "Cloud"
  Cloud = "Azure-cloud"
  Media Type = "AzureVolume"
  Archive Device = "/opt/bacula/volumes/azure"
  LabelMedia = "Yes"
  Random Access = "Yes"
  AutomaticMount = "Yes"
  RemovableMedia = "No"
  AlwaysOpen = "No"
  Maximum Concurrent Jobs = 5
}

Cloud {
  Name = "Azure-cloud"
  Driver = "Azure"
  Hostname = "dummy - not used for Azure"
  BucketName = "bsyscontainer"
  Accesskey = "bsystest"
  Secretkey = "xnligNkNEUpFKLvGP8g9itZujF4ssjZMl9Ya9XE53qqy3a0zhgjxzhIH5N19XKhLcpfifnUkQW258MzCKEBA=="
  Protocol = "https"
  UriStyle = "Path"
  Upload = "EachPart"
}


Final touches

On the Bacula Director, a “reload” of the configuration will be sufficient once you have tested the configuration syntax with the “-t” (test syntax) option of the bacula-dir binary, for example as shown below.
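For example (the paths assume a default Bacula Enterprise installation under /opt/bacula – adjust as needed):

Test the Director config syntax:
# /opt/bacula/bin/bacula-dir -t -c /opt/bacula/etc/bacula-dir.conf

Reload the running Director from bconsole:
# /opt/bacula/bin/bconsole
* reload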

On the Storage Daemon, you will have to restart the daemon (see below). Please make sure that no backup jobs are running on the Storage Daemon when you restart this service:


Stop the bacula-sd daemon:
# systemctl stop bacula-sd

Test the SD config syntax:
# /opt/bacula/bin/bacula-sd -u bacula -t

Start the bacula-sd daemon:
# systemctl start bacula-sd


Now you are ready to run backups to your Azure storage in the cloud. Just create a new backup job, or modify one of your existing backup jobs to use the new Cloud Storage/Autochanger instead of local disk or tape storage targets – a minimal example follows below. The restore process will recover your data from the cloud transparently onto any Bacula File Daemon (client) that is available to the Director.
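For illustration, a minimal Director-side example of such a job might look like the sketch below. The Client, FileSet, Schedule and Messages resources are assumed to already exist in your configuration, and all names here are placeholders rather than a definitive setup:

Pool {
  Name = "Azure-pool"
  Pool Type = "Backup"
  Storage = "Azure-storage"        # the Storage resource defined earlier
  Label Format = "AzureVol-"       # volumes are labeled automatically on the cloud device
  Maximum Volume Bytes = 10G
  Volume Retention = 30 days
}

Job {
  Name = "azure-cloud-backup"
  Type = "Backup"
  Level = "Incremental"
  Client = "yourclient-fd"         # an existing File Daemon (client)
  FileSet = "Full Set"             # an existing FileSet resource
  Schedule = "WeeklyCycle"         # an existing Schedule resource
  Pool = "Azure-pool"
  Messages = "Standard"
}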


About the author
Rob Morrison
Rob Morrison is the marketing director at Bacula Systems. He started his IT marketing career with Silicon Graphics in Switzerland, performing strongly in various marketing management roles for almost 10 years. In the next 10 years Rob also held various marketing management positions in JBoss, Red Hat and Pentaho ensuring market share growth for these well-known companies. He is a graduate of Plymouth University and holds an Honours Digital Media and Communications degree, and completed an Overseas Studies Program.