Tito.Net

Saturday, March 31, 2018

Scenario

This article focuses on providing guidance and a library for copying an Azure BLOB to Amazon S3. Using .NET (C#), I have created a few classes that help copy a large BLOB into Amazon S3.

Points to be considered

As we're targeting large data objects to be copied, we need to perform a Multipart upload.

The High-Level API provides a convenient way of performing a few common operations on S3 by encapsulating the core-level implementation.

The Low-Level API, as you might have guessed, exposes the core Multipart operations on S3.

For our scenario, the Low-Level API is the best fit as we need fine-grained control over the operations.

A maximum of 10,000 parts is allowed per upload.

Part size can range from 5MB to 5GB, with the exception of the last part, which can be less than 5MB.

To handle our scenario, we're going to stick to 100MB as the part size in our library.

Library Source code

The following classes have been created so developers can use them as a library that encapsulates the implementation logic of the copy operation and provides Request and Response objects for ease of use and code management.
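To give an idea of what the copy loop looks like, here is a minimal sketch using the Low-Level S3 API, assuming the WindowsAzure.Storage and AWSSDK.S3 NuGet packages; the class and variable names are illustrative, not the actual library types.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Model;
using Microsoft.WindowsAzure.Storage.Blob;

public class BlobToS3Copier
{
    private const long PartSizeInBytes = 100 * 1024 * 1024; // 100MB parts, as discussed above

    public async Task CopyAsync(CloudBlockBlob sourceBlob, IAmazonS3 s3Client,
                                string bucketName, string key)
    {
        await sourceBlob.FetchAttributesAsync();      // populate the blob length
        long blobLength = sourceBlob.Properties.Length;

        // 1. Initiate the multipart upload (low-level API).
        var initResponse = await s3Client.InitiateMultipartUploadAsync(
            new InitiateMultipartUploadRequest { BucketName = bucketName, Key = key });

        var partETags = new List<PartETag>();
        long offset = 0;
        int partNumber = 1;

        while (offset < blobLength)
        {
            long currentPartSize = Math.Min(PartSizeInBytes, blobLength - offset);

            // 2. Download one 100MB range of the BLOB (held in memory for this sketch).
            using (var partStream = new MemoryStream())
            {
                await sourceBlob.DownloadRangeToStreamAsync(partStream, offset, currentPartSize);
                partStream.Position = 0;

                // 3. Upload that range as one part and remember its ETag.
                var uploadResponse = await s3Client.UploadPartAsync(new UploadPartRequest
                {
                    BucketName = bucketName,
                    Key = key,
                    UploadId = initResponse.UploadId,
                    PartNumber = partNumber,
                    PartSize = currentPartSize,
                    InputStream = partStream
                });
                partETags.Add(new PartETag(partNumber, uploadResponse.ETag));
            }

            offset += currentPartSize;
            partNumber++;
        }

        // 4. Complete the multipart upload with the collected part ETags.
        await s3Client.CompleteMultipartUploadAsync(new CompleteMultipartUploadRequest
        {
            BucketName = bucketName,
            Key = key,
            UploadId = initResponse.UploadId,
            PartETags = partETags
        });
    }
}
```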

Sunday, February 18, 2018

Introduction

You might encounter situations where you need to calculate the checksum of a file / stream while transmitting it across the wire. Nowadays, it is common to transmit file streams frequently, and in that scenario you need to ensure the data has not been corrupted in transit. On the receiving end, you use the same algorithm to recalculate the checksum and verify that the transmitted data is not corrupted.

Scenario

Let me take the same scenario I explained in my previous post, Download large files as chunks and upload them into BLOB, in which we downloaded and transmitted a large file as a stream in chunks. There are lots of articles over the web explaining how to calculate the checksum of a full file stream. Here we'll see a snippet for calculating the checksum of individual chunks and accumulating it at the end.

Tips

HashAlgorithm.TransformBlock and HashAlgorithm.TransformFinalBlock will help you achieve this.
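As a minimal sketch of that accumulation (the algorithm and buffer size here are just examples), the hash is fed chunk by chunk with TransformBlock and finalized once with TransformFinalBlock:

```csharp
using System;
using System.IO;
using System.Security.Cryptography;

public static class ChunkedChecksum
{
    // Computes a checksum incrementally while the stream is read in chunks,
    // so the same buffers you transmit can also feed the hash.
    public static string ComputeSha256(Stream source, int chunkSize = 4 * 1024 * 1024)
    {
        using (var sha256 = SHA256.Create())
        {
            var buffer = new byte[chunkSize];
            int bytesRead;

            while ((bytesRead = source.Read(buffer, 0, buffer.Length)) > 0)
            {
                // Accumulate the hash for this chunk; no output buffer is needed here.
                sha256.TransformBlock(buffer, 0, bytesRead, null, 0);
                // ... transmit buffer[0..bytesRead] here ...
            }

            // Finalize with an empty block once all chunks are processed.
            sha256.TransformFinalBlock(Array.Empty<byte>(), 0, 0);
            return BitConverter.ToString(sha256.Hash).Replace("-", string.Empty);
        }
    }
}
```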

Sunday, February 11, 2018

Introduction

In real-world scenarios there are times when we need to download large files from a Web resource and save them to Azure BLOB storage. Even though there are a few articles over the Web which helped, I wasn't able to get an end-to-end working solution for files ranging from 20GB to 30GB in size. I have detailed the challenges and related solutions below for your convenience.

Problem statement

Download large files from the Web resource and upload them into Azure BLOB storage.

Challenges and Solutions

First, we'll hit a memory issue when we try to read the full stream and load all bytes into memory. An article here explains well how to avoid this memory issue.

Another article here details how to read a FileStream as chunks and upload them into a BLOB. Using the details from this and #1, we can try to achieve the solution.

Use Stream.Read by passing the respective parameters; one of the parameters is the maximum number of bytes to read from the current stream, and the method returns the number of bytes actually read. But that also ran into an issue, as the number of bytes read can be less than the maximum count parameter. Detailed documentation of Stream.Read can be found here.

Along with that, we can use the CloudBlockBlob.PutBlock method to upload the read chunks into the BLOB. Points to note here: each block (chunk) in the BLOB can be a maximum of 100MB in size, and you can have at most 50,000 blocks (100MB x 50,000 blocks ≈ 4.75TB). Detailed documentation can be found here.

Even though #2.1 and #2.2 look straightforward, there can be issues because Read may return fewer bytes than requested, which in turn increases the number of blocks (beyond the 50,000 limit) in Azure BLOB storage.

In order to avoid the issue mentioned above in #2.3, we need to accumulate the results of Stream.Read until they reach the expected size (100MB in our case); a sketch of this accumulation follows these points.

Even after you fix the issues specified in #1 and #2 above, you might face an exception stating "An existing connection was forcibly closed by the remote host." Though I wasn't able to identify the root cause, I found a workaround using the articles here and here: use HTTP 1.0 instead of HTTP 1.1.
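The sketch below pulls the above points together, assuming the WindowsAzure.Storage package; names and the 6-digit block-ID format are illustrative choices, not a prescribed implementation.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Net;
using Microsoft.WindowsAzure.Storage.Blob;

public static class LargeFileToBlobCopier
{
    private const int BlockSizeInBytes = 100 * 1024 * 1024; // 100MB per block (the maximum allowed)

    public static void Download(string sourceUrl, CloudBlockBlob targetBlob)
    {
        var request = (HttpWebRequest)WebRequest.Create(sourceUrl);
        request.ProtocolVersion = HttpVersion.Version10; // workaround from #3: force HTTP 1.0

        var blockIds = new List<string>();
        int blockNumber = 0;

        using (var response = request.GetResponse())
        using (var sourceStream = response.GetResponseStream())
        {
            var blockBuffer = new byte[BlockSizeInBytes];
            int bytesInBlock = 0;
            int bytesRead;

            // Keep reading until Stream.Read returns 0 (end of stream).
            while ((bytesRead = sourceStream.Read(blockBuffer, bytesInBlock,
                                                  BlockSizeInBytes - bytesInBlock)) > 0)
            {
                bytesInBlock += bytesRead;

                // Only flush a block once the full 100MB has been accumulated;
                // otherwise short reads would inflate the block count past 50,000.
                if (bytesInBlock == BlockSizeInBytes)
                {
                    UploadBlock(targetBlob, blockIds, ref blockNumber, blockBuffer, bytesInBlock);
                    bytesInBlock = 0;
                }
            }

            // Flush the final (possibly smaller) block.
            if (bytesInBlock > 0)
            {
                UploadBlock(targetBlob, blockIds, ref blockNumber, blockBuffer, bytesInBlock);
            }
        }

        // Commit the block list so the uploaded blocks become the blob content.
        targetBlob.PutBlockList(blockIds);
    }

    private static void UploadBlock(CloudBlockBlob blob, List<string> blockIds,
                                    ref int blockNumber, byte[] buffer, int count)
    {
        // Block IDs must be base64 strings of equal length within a blob.
        string blockId = Convert.ToBase64String(
            System.Text.Encoding.UTF8.GetBytes(blockNumber.ToString("d6")));
        using (var blockStream = new MemoryStream(buffer, 0, count))
        {
            blob.PutBlock(blockId, blockStream, null);
        }
        blockIds.Add(blockId);
        blockNumber++;
    }
}
```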

Monday, October 30, 2017

Introduction

Azure Service Fabric services can be configured to exhibit trigger-based behavior similar to WebJobs and Azure Functions.

Time triggered / Scheduler Service in Azure Service Fabric

To create a time triggered (scheduler) service in Azure Service Fabric, I can think of the following two quick options among the possible ways.

Create a time triggered WebJob and add/deploy it as a guest executable in Azure Service Fabric.

Create an Azure Service Fabric stateless service and implement the listener to handle jobs.

Here I'm taking the second approach, which involves creating the respective listener. Rather than writing a new scheduling framework, I'm using the Quartz.Net framework, with CRON expressions to keep the behavior in line with the time-trigger behavior of WebJobs and Azure Functions.
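As a minimal sketch (assuming the Quartz 3.x and Microsoft.ServiceFabric.Services packages), one way to wire this up is to start the Quartz scheduler from the stateless service's RunAsync; the job class, service name, and CRON expression below are illustrative.

```csharp
using System.Fabric;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.ServiceFabric.Services.Runtime;
using Quartz;
using Quartz.Impl;

// Illustrative job: replace the body with the actual work to be triggered.
public class SampleJob : IJob
{
    public Task Execute(IJobExecutionContext context)
    {
        // ... do the scheduled work here ...
        return Task.CompletedTask;
    }
}

internal sealed class SchedulerService : StatelessService
{
    public SchedulerService(StatelessServiceContext context) : base(context) { }

    protected override async Task RunAsync(CancellationToken cancellationToken)
    {
        // Build and start a Quartz scheduler inside the stateless service.
        IScheduler scheduler = await new StdSchedulerFactory().GetScheduler();
        await scheduler.Start();

        IJobDetail job = JobBuilder.Create<SampleJob>()
                                   .WithIdentity("sampleJob")
                                   .Build();

        // CRON expression mirrors the time-trigger style of WebJobs/Functions:
        // here, every 5 minutes (illustrative).
        ITrigger trigger = TriggerBuilder.Create()
                                         .WithIdentity("sampleTrigger")
                                         .WithCronSchedule("0 0/5 * * * ?")
                                         .Build();

        await scheduler.ScheduleJob(job, trigger);

        // Keep RunAsync alive until the service is asked to shut down,
        // then stop the scheduler gracefully.
        try
        {
            await Task.Delay(Timeout.Infinite, cancellationToken);
        }
        finally
        {
            await scheduler.Shutdown(waitForJobsToComplete: true);
        }
    }
}
```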

Introduction

In ASP.NET WEB API / MVC, if the request input model has a property of bool type and the content type is application/json, it may even accept numbers, which will be converted to the bool type. If you send numbers as input for a bool, 0 is converted to false and everything else to true (with one exception observed - 08 and 09 become null).

Problem statement

Restrict numbers from being accepted as input for a bool-typed property when the content type is JSON.
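One way to enforce this (a sketch, assuming Newtonsoft.Json is the configured JSON formatter) is a custom JsonConverter that only accepts real JSON true/false tokens; the converter name is illustrative.

```csharp
using System;
using Newtonsoft.Json;

// Illustrative converter: rejects anything other than a JSON true/false token
// (or null for a nullable bool).
public class StrictBooleanConverter : JsonConverter
{
    public override bool CanConvert(Type objectType)
    {
        return objectType == typeof(bool) || objectType == typeof(bool?);
    }

    public override object ReadJson(JsonReader reader, Type objectType,
                                    object existingValue, JsonSerializer serializer)
    {
        if (reader.TokenType == JsonToken.Boolean)
        {
            return (bool)reader.Value;
        }

        if (reader.TokenType == JsonToken.Null && objectType == typeof(bool?))
        {
            return null;
        }

        throw new JsonSerializationException(
            $"Only true/false is allowed for {objectType.Name}, but got token '{reader.TokenType}'.");
    }

    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        writer.WriteValue((bool)value);
    }
}
```

The converter can then be applied per property with [JsonConverter(typeof(StrictBooleanConverter))], or registered globally via the serializer settings of the JSON formatter, so any number sent for a bool property fails model binding instead of being silently coerced.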

Saturday, October 28, 2017

Requirement

Calculate execution time of ASP.NET WEB API actions

Create common code that can calculate and log the details

Project uses Autofac for Dependency Injection.

If you've integrated Azure Application Insights into your WEB API, it automatically logs/captures the execution time of each request. If you need this as additional logging, or in a place where Application Insights isn't associated, this will be helpful. Let's get into the code directly.

Approach

We will be using an ActionFilter to hold the logic for calculating the execution time.

In the case of a WEB API that uses Autofac as the dependency resolver, we can utilize IAutofacActionFilter to achieve the same.

Register the respective action filter globally to execute for all controllers and actions, so that you don't need to worry when you add new controllers/actions.

In the example below, I've provided a sample snippet for the Autofac-based solution. Ignore the custom logger in the example; it is just there to portray the sample.
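Here is a sketch of such a filter, assuming the Autofac.WebApi2 integration package; the filter class, the ILogger abstraction, and the registration shown in the comments are illustrative placeholders.

```csharp
using System.Diagnostics;
using System.Threading;
using System.Threading.Tasks;
using System.Web.Http.Controllers;
using System.Web.Http.Filters;
using Autofac.Integration.WebApi;

// Placeholder logger abstraction, just for portraying the sample (as noted above).
public interface ILogger
{
    void Log(string message);
}

// Measures the time between action start and end and hands it to the logger.
public class ExecutionTimeActionFilter : IAutofacActionFilter
{
    private const string StopwatchKey = "ExecutionTimeStopwatch";
    private readonly ILogger _logger;

    public ExecutionTimeActionFilter(ILogger logger)
    {
        _logger = logger;
    }

    public Task OnActionExecutingAsync(HttpActionContext actionContext,
                                       CancellationToken cancellationToken)
    {
        // Start a stopwatch and stash it on the request so the "executed" hook can read it.
        actionContext.Request.Properties[StopwatchKey] = Stopwatch.StartNew();
        return Task.CompletedTask;
    }

    public Task OnActionExecutedAsync(HttpActionExecutedContext actionExecutedContext,
                                      CancellationToken cancellationToken)
    {
        if (actionExecutedContext.Request.Properties.TryGetValue(StopwatchKey, out var value)
            && value is Stopwatch stopwatch)
        {
            stopwatch.Stop();
            _logger.Log($"{actionExecutedContext.ActionContext.ActionDescriptor.ActionName} " +
                        $"took {stopwatch.ElapsedMilliseconds} ms");
        }

        return Task.CompletedTask;
    }
}

// Global registration in the Autofac container builder, so the filter runs for
// all controllers/actions (per the point above):
// builder.RegisterType<ExecutionTimeActionFilter>()
//        .AsWebApiActionFilterFor<ApiController>()
//        .InstancePerRequest();
```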

Introduction

Visual Studio Team Services (VSTS) provides the ability to incorporate Continuous Integration and Continuous Deployment. For Azure WebApps/WebJobs, documentation and blogs are available (like this, this) to guide you on how to achieve those. In this post we're going to look at a scenario where the Web deployment package itself might not get created, and the solution for that.

But even after that, the Web deployment package zip file does not get created.

Possible Cause and Solution

Ensure you have the Microsoft.Web.WebJobs.Publish NuGet package added to the respective project. If the package is not associated, the build step in your CI execution won't throw any exception; it simply won't create the web deployment zip package.