Backing up a database in Azure Storage
In previous articles on migrating to cloud-based SQL Server, we looked at various options for transferring a database backup to the Cloud. As a reminder, we are talking about the IaaS approach, i.e. deploying SQL Server on a Windows Azure virtual machine. The alternative PaaS approach is fundamentally different, because Windows Azure SQL Database (SQL Azure) does not support the regular T-SQL backup/restore commands, nor is there any way to detach and attach database files. Other methods must be used there, such as DAC, BCP, SSIS, SQL Azure Sync, etc.
In this article, we continue to review the IaaS methods. Since making and restoring a backup is not a problem in IaaS, the main working task is to transfer the backup copy of the database to the Cloud and back as efficiently as possible. In this respect, it is no different from uploading or downloading any blob to or from Azure Storage. We have already examined the use of the Blob Service REST API (http://msdn.microsoft.com/en-us/library/dd135733.aspx), which is very simple in concept but rather painstaking to implement, since it requires carefully constructing the body of a PUT request; the wrapper provided by the Azure SDK (http://msdn.microsoft.com/en-us/library/dd179380.aspx), which includes ready-made classes that encapsulate the preparatory work otherwise done manually with the "raw" REST API; and, finally, the process of transferring a backup on a separate VHD disk attached to the cloud virtual machine.
With the release of Cumulative Update 2 for SQL Server 2012 SP1, this process has been simplified further: it is now possible to create backup copies of a database directly in Azure Storage using regular T-SQL commands and, accordingly, restore from them. All the non-SQL Server overhead of transferring the backup to the Cloud is built into the T-SQL commands.
We only need to have a storage account (Windows Azure Storage Account), which will be used for intermediate backup storage.
We examined the process of creating a storage account in the article on the Blob Service REST API (http://blogs.technet.com/b/isv_team/archive/2012/10/25/3528566.aspx), so we will not dwell on it in detail now. As a reminder, it makes sense to create the account in the same data center as the virtual machine with SQL Server on which the backup is to be restored, to save on traffic between data centers. In my case, for example, the lvmtest4 virtual machine with a default instance of SQL Server is located in the North Europe region, so the tststorage account for transferring the database backup will be created there. The creation process is described in more detail in the Windows Azure documentation (http://msdn.microsoft.com/en-us/library/windowsazure/gg433066.aspx). Inside the account, we will create the container container1 (capital letters are not supported in container names). For security purposes, the container will be created as private, which means that access to its contents will require one of the access keys (primary or secondary), which can be viewed in the storage account properties.
Of the prerequisites, we also need CU2 for SQL Server 2012 SP1 installed on the local server. As usual, the cumulative updates will most likely be included in the next service pack, but at the time of this writing you must install Service Pack 1 for SQL Server 2012, if it is not already installed (http://www.microsoft.com/en-us/download/details.aspx?id=35575), and then Cumulative Update 2 for that service pack (http://support.microsoft.com/kb/2790947). When we covered this topic at the Windows Azure Summit, the audience drew attention to the phrase "This service pack contains SQL Server 2012 Cumulative Update 1 (CU1) and Cumulative Update 2 (CU2)", so it is worth clarifying once more: that phrase refers to the cumulative updates for the RTM version of SQL Server 2012, whereas we need CU2 for SP1. The link opens a form in which you mark which files you need (I marked all three) and enter an e-mail address, to which a message from hotfix@microsoft.com with download links arrives almost immediately. For our purposes it is enough to install SQLServer2012_SP1_CU2_2790947_11_0_3339_x64, so that select @@version reports no less than Microsoft SQL Server 2012 (SP1) - 11.0.3339.0 (X64), Jan 14 2013 19:02:10, Build 9200.
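To check which build is currently installed, you can query the server properties; a product version of 11.0.3339 or higher means SP1 CU2 is in place (a quick sanity check, equivalent to reading select @@version):

```sql
-- ProductVersion should be 11.0.3339.0 or higher for SP1 CU2
SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('ProductLevel')   AS ProductLevel;
```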
In order for the local SQL Server to be able to access the cloud storage, you first need to create a credential on it, specifying the name of the storage account and one of its access keys (primary or secondary, it makes no difference).
Script 1:
if exists (select 1 from sys.credentials
           where name = 'SqlToAzureStorage' and credential_identity = 'tststorage')
    drop credential SqlToAzureStorage
CREATE CREDENTIAL SqlToAzureStorage
WITH IDENTITY = 'tststorage' --storage account
   , SECRET = 'oY1AUn6/5/IWz8dfQJzidOVY8HRUKOz1k5MsSnV86xV46fEtQCAigC3Fd8Lgkn2fv6SotsRpZm6w2tRaQVAovw=='
The access keys can be found in the Windows Azure Management Portal by selecting your storage account, then Configure in the upper menu and Manage Keys in the lower menu.
Fig. 1
Fig. 2
That is essentially all the preparation. Now create a backup copy of your favorite AdventureWorks database in the cloud storage.
Script 2:
BACKUP DATABASE AdventureWorks2012
TO URL = 'https://tststorage.blob.core.windows.net/container1/AdventureWorks2012.bak'
WITH CREDENTIAL = 'SqlToAzureStorage', INIT, STATS = 10
Everything is straightforward here. The URL is formed as 'https://<Azure Storage account name>.blob.core.windows.net/<container name>/<name under which the backup file will be created in the container>'. It can also be viewed in the Windows Azure Management Portal in the container properties:

Fig . 3
If CU2 for SP1 is not installed, executing this command produces the error Msg 155, Level 15, State 1, Line 2: 'URL' is not a recognized Device Type option. But we have installed it, so everything succeeds.

Fig. 4
If you now open the container1 container from Fig. 3, you will see that the AdventureWorks2012.bak backup has been created there.
Comments:
• The maximum backup size must not exceed 1 TB, due to the limitations of Azure Blob Storage.
• Not only full database backups are supported, but also transaction log backups, file and filegroup backups, differential backups, and backup compression.
• If a file with the same name already exists in the container, an error will occur. To overwrite it, use the FORMAT option.
• Backing up a database to Azure Blob Storage is not yet supported from the SSMS user interface.
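As a sketch of the last two points, a transaction log backup to the same container and an overwrite of the existing blob might look as follows (the AdventureWorks2012.trn file name here is my own choice; account, container, and credential are the same as in Scripts 1 and 2):

```sql
-- Back up the transaction log to the same container, with compression
BACKUP LOG AdventureWorks2012
TO URL = 'https://tststorage.blob.core.windows.net/container1/AdventureWorks2012.trn'
WITH CREDENTIAL = 'SqlToAzureStorage', COMPRESSION, STATS = 10

-- Re-run the full backup, overwriting the existing AdventureWorks2012.bak blob
BACKUP DATABASE AdventureWorks2012
TO URL = 'https://tststorage.blob.core.windows.net/container1/AdventureWorks2012.bak'
WITH CREDENTIAL = 'SqlToAzureStorage', FORMAT, STATS = 10
```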
Now we go to the cloud virtual machine with SQL Server installed on it, create the same credential there (Script 1), and restore the database from the freshly created cloud backup in a completely symmetrical fashion. Obviously, SP1 CU2 must also be installed on that SQL Server.
Script 3:
RESTORE DATABASE AdventureWorks2012
FROM URL = 'https://tststorage.blob.core.windows.net/container1/AdventureWorks2012.bak'
WITH CREDENTIAL = 'SqlToAzureStorage', REPLACE, STATS = 10
Fig. 5
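Before restoring, it can be useful to confirm that the blob in the container is a readable backup. Assuming the same credential, and assuming the build supports the metadata variants of RESTORE against a URL (a sketch, not a required step of the procedure):

```sql
-- Read the backup header straight from the blob to verify it is a valid backup
RESTORE HEADERONLY
FROM URL = 'https://tststorage.blob.core.windows.net/container1/AdventureWorks2012.bak'
WITH CREDENTIAL = 'SqlToAzureStorage'
```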

Similarly, you can go in the opposite direction: create a backup of a cloud database in Azure Storage and restore it to an on-premises SQL Server, as well as transfer databases between cloud or on-premises machines.