Stream video with Azure and .NET

    As part of the Jisp project, developed by WaveAccess, we faced the task of uploading and playing videos. Nothing unusual in itself, but it comes with well-known challenges: the video needs preprocessing, and it must be streamed in the required quality and format. It should also play successfully on the major platforms, namely Android, iOS, and desktop (web). Read on to see how we solved these problems!

    Jisp is an application for customers and owners of online and offline stores. It is installed on the user's mobile device, where it shares discounts and special offers, collects browsing history, and builds a user profile for retailers.

    The application is integrated with iBeacons. These are installed next to products (clothing, equipment) in an offline store; when a visitor views a product (touches its tag), a special offer and detailed product information appear on their mobile device. The application also implements something like a social network for shopping enthusiasts: you can share videos, interesting products, and reviews.

    In 2017, the application received the Microsoft Partners Award in the Business category for a machine learning module that builds an accurate profile of the user based on their online and offline choices and sends out personalized offers.

    To solve this problem, we used Azure Media Services (AMS); the server side runs on .NET. In addition to the standard requirements associated with video streaming, there were also requirements driven by the application's business logic. Users can upload videos and attach them to a specific geolocation. The full video should be available only within the specified area; elsewhere, other users can play only a short fragment chosen by the video's author. In addition, the video needs to be cropped to square proportions.

    Setting up access

    Before uploading any video, you must configure storage and access to it. To do this, prepare two accounts: a Media Services account and an Azure Storage account.

    Media Services account

    This account is responsible for storing video metadata. When creating this type of account, you need to set the following parameters:

    • Resource group - groups resources in Azure and lets you configure access to the entire group
    • Location - the region of the data center in which the resource will be created
    • Storage account - the storage account used to store asset files

    Storage account

    This type of account is designed to store files; in our case, it is directly responsible for storing the video.


    After the accounts are created, you need to configure authorization. Azure Media Services uses authorization through Azure Active Directory. There are two authorization options:

    • User
    • Service principal

    The first approach makes sense if there is no intermediate service that could take on authorization, for example, a desktop application that interacts directly with Azure Media Services. In this case, we ask the user for their Azure Active Directory username and password.
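
    For illustration, here is a minimal sketch of the user-authorization variant with the AMS .NET SDK (v2). The constructor that takes only the tenant and environment triggers an interactive Azure Active Directory sign-in prompt on the first request; the tenant and API endpoint below are hypothetical placeholders.

    // Interactive user authorization: the SDK prompts for AAD credentials
    // when the first request is made. Tenant and endpoint are placeholders.
    var credentials = new AzureAdTokenCredentials(
        "contoso.onmicrosoft.com",
        AzureEnvironments.AzureCloudEnvironment);
    var tokenProvider = new AzureAdTokenProvider(credentials);
    var context = new CloudMediaContext(
        new Uri("https://contoso.restv2.westeurope.media.azure.net/api/"),
        tokenProvider);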

    Authorization with user accounts

    The second option is more convenient if we have our own service responsible for authorizing users in the application, for example, a mobile or web application with its own back-end service. In this case, the keys for accessing AMS are stored on the server, and user authorization is handled by the application's business logic.

    Authorization using Service Principal

    For Jisp, we chose the second option.

    With the accounts and authorization prepared, we can begin integrating with the .NET SDK for AMS. The main object responsible for working with AMS entities is CloudMediaContext. The code below initializes it:

    // clientId / clientKey are the service principal's credentials,
    // tenant is the Azure Active Directory tenant domain
    var azureKey = new AzureAdClientSymmetricKey(clientId, clientKey);
    var credentials = new AzureAdTokenCredentials(tenant, azureKey, AzureEnvironments.AzureCloudEnvironment);
    var tokenProvider = new AzureAdTokenProvider(credentials);
    var context = new CloudMediaContext(new Uri(url), tokenProvider);


    After initializing CloudMediaContext, we can proceed to uploading the video. For this, AMS uses asset entities. An asset is a resource that contains the video stream, audio, thumbnail images, and other files related to the video. First, you need to create the asset itself:

    var inputAsset = context.Assets.Create(assetName, storageName, AssetCreationOptions.None);

    After that, you need to add the source video file to the asset and upload its content:

    var assetFile = inputAsset.AssetFiles.Create(fileName);
    assetFile.Upload(filePath); // uploads the local file into the asset's storage

    Next, you need to convert it for adaptive streaming:

    var job = context.Jobs.CreateWithSingleTask(
        "Media Encoder Standard",   // media processor name
        "Adaptive Streaming",       // built-in encoding preset
        inputAsset,
        "Adaptive Bitrate MP4",     // output asset name
        AssetCreationOptions.None);
    job = job.StartExecutionProgressTask(
        j =>
        {
            Console.WriteLine("Job state: {0}", j.State);
            Console.WriteLine("Job progress: {0:0.##}%", j.GetOverallProgress());
        }, CancellationToken.None).Result;
    Console.WriteLine("Transcoding job finished.");
    var outputAsset = job.OutputMediaAssets[0];

    After the video is ready, we publish it using the Locator:

    context.Locators.Create(LocatorType.OnDemandOrigin, outputAsset, AccessPermissions.Read, TimeSpan.FromDays(365));

    Now we can play our video.

    var videoUri = outputAsset.GetMpegDashUri();
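
    MPEG-DASH is not the only option: the same published asset can be streamed in other formats, which is convenient for the platforms mentioned above (HLS for iOS, DASH for Android and web). A sketch using the v2 SDK's extension methods:

    // Build streaming URIs for different protocols from one published asset
    var dashUri = outputAsset.GetMpegDashUri();          // MPEG-DASH (Android, web)
    var hlsUri = outputAsset.GetHlsUri();                // HLS (iOS)
    var smoothUri = outputAsset.GetSmoothStreamingUri(); // Smooth Streaming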


    Azure Media Services offers a filtering mechanism for modifying the video stream. You can specify the quality of audio or video, the resolution, the codec used, or select a specific time interval.

    There are two types of filters:

    • Local. Such filters can be applied only to the video for which they were created, so each video needs its own filter.
    • Global. Such filters can be applied to any video in the account. They are especially useful when you need to support streaming in various formats for various devices.

    The following is an example of creating a global filter that limits video playback to a specified interval (time values are expressed in 100-nanosecond units):

    var filterName = $"{length}seconds";
    context.Filters.Create(
        filterName,
        new PresentationTimeRange(start: 0, end: Convert.ToUInt64(10000000 * length)),
        new List<FilterTrackSelectStatement>());
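
    For comparison, a local filter is created on the asset itself rather than on the context, so it applies only to that asset. A sketch, assuming the same v2 SDK:

    // A local (asset-level) filter: applies only to this asset's streams
    var localFilter = outputAsset.AssetFilters.Create(
        "10seconds",
        new PresentationTimeRange(start: 0, end: 100000000), // 10 s in 100-ns units
        new List<FilterTrackSelectStatement>());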

    To apply the created filters, you need to modify the streaming URL as follows.

    Source URL:
    testendpoint-testaccount.streaming/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest(format=mpd-time-csf)

    URL after adding the 10seconds filter:
    testendpoint-testaccount.streaming/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest(format=mpd-time-csf,filter=10seconds)

    If several filters need to be applied, the URL becomes:
    testendpoint-testaccount.streaming/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest(format=mpd-time-csf,filter=10seconds;square)


    Like any cloud service, AMS greatly simplifies the scaling process.
    In terms of video preprocessing, AMS provides the following options:

    1. Vertical scaling, i.e. increasing video processing speed. There are three tiers with relative performance x1, x2, and x4. If the time to prepare a video is not critical, you can choose the minimum tier; if processing time must be kept to a minimum, it makes sense to use x4.
    2. Horizontal scaling, i.e. the ability to process multiple videos simultaneously. This is easily achieved by setting the required number of media processing units.
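
    Both kinds of scaling can be configured from the same SDK. A sketch, assuming the v2 SDK's EncodingReservedUnits collection, where the Basic/Standard/Premium unit types correspond to the x1/x2/x4 tiers:

    // Switch the reserved unit type (speed) and count (parallelism)
    var encodingUnits = context.EncodingReservedUnits.FirstOrDefault();
    encodingUnits.ReservedUnitType = ReservedUnitType.Premium; // x4 tier
    encodingUnits.CurrentReservedUnits = 2;                    // two videos in parallel
    encodingUnits.Update();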

    In terms of video streaming, AMS supports automatic bandwidth scaling as needed. There is also a division into standard and premium tiers: the first is designed for bandwidth up to 600 Mbps, the second provides 200 Mbps per streaming unit.


    For comparison, we took 720p videos of 30 seconds, 1 minute, and 5 minutes in length and measured the preprocessing time. The results are as follows:

    Rate | 30 sec (4.4 MB)  | 1 minute (8.1 MB) | 5 minutes (44.9 MB)
    x1   | 1:32 (49 KB/s)   | 2:14 (62 KB/s)    | 8:38 (89 KB/s)
    x2   | 1:07 (68 KB/s)   | 1:31 (91 KB/s)    | 5:07 (150 KB/s)
    x4   | 0:33 (137 KB/s)  | 0:45 (185 KB/s)   | 1:49 (422 KB/s)
    For short videos, the gain in preprocessing performance is not so significant, especially on the x2 tier. But as the video length grows, the situation improves, and the x2 and x4 tiers deliver a corresponding increase in processing speed.

    Pitfalls

    Azure Media Services, like any technology, imposes its limitations.

    The first difficulty we encountered was the delay between the moment the video is uploaded and the moment it is ready for streaming. Ideally, streaming would be possible immediately after upload. In principle, AMS makes it possible to get a link to the raw video, but preparing the asset files still took longer than simply uploading the video to Azure Storage, bypassing AMS. So we decided to first upload the source video to Azure Storage directly and serve it from there while preprocessing is in progress.

    The next difficulty was related to changing the proportions of the video. In our case, AMS handled the non-standard aspect ratio (1:1) incorrectly. Although the video itself became square, the metadata reported a different aspect ratio, and the thumbnail image was distorted. As a result, we had to implement this task using ffmpeg.

    There was also some difficulty with specifying the playable fragment. By default, the fragment boundaries were a multiple of 6 seconds. Using the advanced encoder configuration, we were able to reduce the step to two seconds, which was acceptable.
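
    The "advanced configuration" here refers to replacing the built-in "Adaptive Streaming" preset string with a custom Media Encoder Standard preset JSON when creating the job. Below is a trimmed, illustrative fragment of such a preset; the two-second KeyFrameInterval is what determines the fragment step, while the bitrates and resolutions are placeholder values:

    {
      "Version": 1.0,
      "Codecs": [
        {
          "Type": "H264Video",
          "KeyFrameInterval": "00:00:02",
          "H264Layers": [
            { "Type": "H264Layer", "Profile": "Auto", "Bitrate": 4500,
              "Width": 1280, "Height": 720 }
          ]
        },
        { "Type": "AACAudio", "Profile": "AacLc", "Channels": 2,
          "SamplingRate": 48000, "Bitrate": 128 }
      ],
      "Outputs": [
        { "FileName": "{Basename}_{Video Bitrate}.mp4",
          "Format": { "Type": "MP4Format" } }
      ]
    }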

    Otherwise, Azure Media Services does its job well, providing a turnkey solution for streaming video to many clients.

    About the author

    The WaveAccess team creates technically sophisticated, highly loaded, fault-tolerant software for companies around the world. Alexander Azarov, senior vice president of software development at WaveAccess, comments:
    "At first glance, difficult tasks can be solved by relatively simple methods. It is important not only to learn new tools, but also to perfect your knowledge of familiar technologies."
