Database migration to Windows Azure SQL VM. BLOB Storage + Azure SDK

    In the previous post we learned to upload files to Azure Storage using the REST API and uploaded the AdventureWorks2012 database backup there.
    It remains to download it onto a cloud virtual machine and restore it to the SQL Server installed there. In this respect, work with Azure Storage is completely symmetrical: the on-premise client and the cloud virtual machine exchange files through Azure Storage - one uploads there, the other reads.

    Since container1 was created as a public container, no digital signature needs to be generated to list and read the blobs it contains:
    tststorage.blob.core.windows.net/container1/aaa.txt
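    For example, a blob in a public container can be fetched with a plain anonymous HTTP GET, no authorization header required; a minimal sketch (aaa.txt is the test blob from the previous post):

    using System;
    using System.Net;

    class PublicRead
    {
        static void Main()
        {
            // A public container requires no signature - an anonymous GET is enough.
            using (WebClient wc = new WebClient())
            {
                string text = wc.DownloadString("http://tststorage.blob.core.windows.net/container1/aaa.txt");
                Console.WriteLine(text);
            }
        }
    }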

    If the container was created as private, you must authenticate to read blobs from it. As we remember, for this it is not enough to pass the primary or secondary access key of the Azure Storage account: you have to carefully build a string of a prescribed format and mix in a hash-based signature, as was done in Script 2 of the previous post. Working with Azure Storage through the REST API is convenient in that it requires no additional tools, being essentially bare HTTP Request/Response, but it does demand painstaking work.

    In this post we use the Azure SDK, which conveniently hides that preparatory work and offers more human-readable interfaces for reading/writing blobs, managing containers, etc. It is free and available from the Windows Azure .NET Developer Center. Installing the Windows Azure SDK requires Visual Studio 2010 SP1 or 2012. During installation the Web Platform Installer 4.0 is launched, which carries out the rest of the setup. The SDK includes the Windows Azure Authoring Tools - June 2012 Release, the Windows Azure Emulator, Windows Azure Libraries for .NET 1.7, and Windows Azure Tools for Microsoft Visual Studio and for LightSwitch.

    The cloud emulator is a handy thing that lets you build cloud applications locally, so you do not have to pay for compute time and space in Azure Storage while debugging. It consists of a compute emulator for running cloud services and a storage emulator for tables, queues, and blobs. In the REST API documentation links given in the previous post, you probably noticed that along with the Request URI for the GET, PUT, ... methods, an Emulated Storage Service URI is also given.
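    Incidentally, with the SDK object model there is no need to hand-craft those emulated URIs: the SDK has a built-in development storage account. A minimal sketch, assuming the storage emulator is running (the container name testcontainer is illustrative):

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    class EmulatorDemo
    {
        static void Main()
        {
            // Well-known development storage account - no real account or key needed.
            CloudStorageAccount acct = CloudStorageAccount.DevelopmentStorageAccount;
            CloudBlobClient clnt = acct.CreateCloudBlobClient();
            CloudBlobContainer cntr = clnt.GetContainerReference("testcontainer");
            cntr.CreateIfNotExist();  // created in local emulator storage, not in the Cloud
        }
    }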
    After installing the SDK, it becomes possible to use the Server Explorer in Visual Studio to view information related to the Cloud. The Windows Azure Storage node initially displays storage emulator objects. To connect to Azure Storage, you need to specify a Storage Account.

    Here Account Key is one of the primary/secondary pair that we saw in Figure 10 of the previous post. The list of containers and their contents can now be viewed directly from the Visual Studio environment, just as in the Azure Management Portal.
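    The same name/key pair can also be written as a storage connection string, the form in which the SDK usually consumes it; a minimal sketch using the account from this series:

    using System;
    using Microsoft.WindowsAzure;

    class ParseDemo
    {
        static void Main()
        {
            // Account name and primary key expressed as a storage connection string.
            CloudStorageAccount acct = CloudStorageAccount.Parse(
                "DefaultEndpointsProtocol=https;AccountName=tststorage;"
                + "AccountKey=xws7rilyLjqdw8t75EHZbsIjbtwYDvpZw790lda0L1PgzEqKHxGNIDdCdQlPEvW5LdGWK/qOZFTs5xE4P93A5A==");
            Console.WriteLine(acct.BlobEndpoint);  // https://tststorage.blob.core.windows.net/
        }
    }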
    A blob can be opened by selecting Open from its context menu, by clicking the Open button on the toolbar, or simply by double-clicking it. It is then downloaded to a local temporary directory. The blob can be edited and saved to a local file. To save it (or any other local file) back to Azure Storage, VS 2010 had an Upload Blob button. In VS 2012 I simply cannot find it, and I am not alone.
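    In the meantime, the upload is easy to do in code with the same object model; a minimal sketch (account and key as in Script 1 below; the local path is illustrative):

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    class UploadDemo
    {
        static void Main()
        {
            CloudStorageAccount acct = new CloudStorageAccount(
                new StorageCredentialsAccountAndKey("tststorage",
                    "xws7rilyLjqdw8t75EHZbsIjbtwYDvpZw790lda0L1PgzEqKHxGNIDdCdQlPEvW5LdGWK/qOZFTs5xE4P93A5A=="), true);
            CloudBlob blob = acct.CreateCloudBlobClient()
                .GetContainerReference("container1")
                .GetBlobReference("aaa.txt");
            blob.UploadFile(@"c:\temp\aaa.txt");  // hypothetical local path
        }
    }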
    Let us use the Azure SDK object model to read a file from Azure Storage. With it, the code is shorter and more readable than with the REST API.

    using System;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;
    using System.IO;

    class Program
    {
        static void Main(string[] args)
        {
            string storageAccount = "tststorage";
            // Primary access key of the storage account (Figure 10 of the previous post).
            string accountPrimaryKey = "xws7rilyLjqdw8t75EHZbsIjbtwYDvpZw790lda0L1PgzEqKHxGNIDdCdQlPEvW5LdGWK/qOZFTs5xE4P93A5A==";
            string blobContainer = "container1";
            string blobName = "AdventureWorks2012.bak"; // or "aaa.txt"

            // true = use HTTPS when talking to the storage endpoints.
            CloudStorageAccount acct = new CloudStorageAccount(new StorageCredentialsAccountAndKey(storageAccount, accountPrimaryKey), true);
            CloudBlobClient clnt = acct.CreateCloudBlobClient();
            CloudBlobContainer cntr = clnt.GetContainerReference(blobContainer);
            CloudBlob blob = cntr.GetBlobReference(blobName);

            // Download the backup to a local file on this machine.
            blob.DownloadToFile(@"c:\temp\AdventureWorks2012.bak");
        }
    }
    Script 1

    First, you need to add a reference to Microsoft.WindowsAzure.StorageClient to the project (it is listed under Extensions in the Add Reference dialog).
    We build the project, take the resulting executable and the Microsoft.WindowsAzure.StorageClient.dll library from the solution's Bin\Debug folder, copy them over a remote access session to the cloud virtual machine (which has neither Visual Studio nor the Windows Azure SDK), and run the exe: AdventureWorks2012.bak is downloaded from Azure Storage onto the virtual machine. It takes about a minute. Then we open SSMS and restore the backup to the SQL Server on the cloud virtual machine.
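    For completeness, the restore step can also be scripted instead of clicking through SSMS; a minimal sketch, assuming a default SQL Server instance on the VM and that the data file paths recorded in the backup exist there (otherwise WITH MOVE would be needed):

    using System.Data.SqlClient;

    class RestoreDemo
    {
        static void Main()
        {
            // Local default instance on the cloud VM, Windows authentication inside the RDP session.
            using (SqlConnection conn = new SqlConnection("Server=.;Integrated Security=true"))
            {
                conn.Open();
                SqlCommand cmd = new SqlCommand(
                    @"RESTORE DATABASE AdventureWorks2012
                      FROM DISK = 'c:\temp\AdventureWorks2012.bak'
                      WITH RECOVERY", conn);
                cmd.CommandTimeout = 0;  // a restore can take a while
                cmd.ExecuteNonQuery();
            }
        }
    }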
    It should be noted that the described method of transferring the backup from the local machine to the cloud virtual machine through Azure Storage cost us nothing in terms of traffic: inbound traffic to the Cloud is free by definition, and the Storage Account for intermediate backup storage was created in the same data center as the virtual machine, so downloading the backup from tststorage to the virtual machine was also free. From the point of view of occupied space, however, AdventureWorks2012.bak is stored twice: first it was uploaded to blob storage, then it was downloaded onto the vhd, which in fact is also a blob. For large backups this can incur additional space costs; see the Storage section of the pricing page. In the next post we will see how to optimize these costs.
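    One obvious mitigation, though not necessarily the optimization the next post has in mind, is to delete the intermediate blob once the restore has succeeded; a minimal sketch:

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    class CleanupDemo
    {
        static void Main()
        {
            // Same account and key as in Script 1.
            CloudStorageAccount acct = new CloudStorageAccount(
                new StorageCredentialsAccountAndKey("tststorage",
                    "xws7rilyLjqdw8t75EHZbsIjbtwYDvpZw790lda0L1PgzEqKHxGNIDdCdQlPEvW5LdGWK/qOZFTs5xE4P93A5A=="), true);
            // Remove the intermediate copy; the restored database on the vhd is unaffected.
            acct.CreateCloudBlobClient()
                .GetContainerReference("container1")
                .GetBlobReference("AdventureWorks2012.bak")
                .DeleteIfExists();
        }
    }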
