Tuesday, March 15, 2022

Azure Large Block Blob Upload with SAS and REST

While trying to find an example of uploading a large block blob to Azure (splitting it into blocks and finalizing with the block list API) using the SAS token approach against the REST endpoint, the only sample I could find was in JavaScript. I worked through the code and turned it into C#. The important parts are that you append comp=block and a Base64-encoded blockid to the end of the SAS URL for each block, and then finalize the upload by calling the comp=blocklist endpoint with an XML body listing all of the blocks.

async Task Main()
{
	// Note: References Nuget Package RestSharp
	var azureSasTokenGetUrl = "";
	var maxBlockSize = 1024 * 1024 * 10; // 10MB block size	
	var fileToUpload = @"";
	var info = new FileInfo(fileToUpload);

	// custom URL for getting sas token accepted a file name value
	var client = new RestClient($"{azureSasTokenGetUrl}?name={info.Name}");
	var response = await client.GetAsync(new RestRequest());
	if (response.IsSuccessful)
	{
		// clean url of any quotes
		var sasUrl = response.Content.Replace("\"", string.Empty);
		Console.WriteLine($"SAS URL: {sasUrl}");

		var blobList = new List<string>();
		var uploadSuccess = false;
		var fileLength = info.Length;
		var numberBlocks = (fileLength % maxBlockSize == 0) ? (int)(fileLength / maxBlockSize) : (int)Math.Floor((double)fileLength / maxBlockSize) + 1;
		client = new RestClient(sasUrl);
		using (var stream = File.OpenRead(fileToUpload))
		{
			for (var i = 0; i < numberBlocks; i++)
			{
				var readSize = (int)(i == (numberBlocks - 1) ? fileLength - (i * maxBlockSize) : maxBlockSize);
				var buffer = new byte[readSize];
				// block ids must all be the same length, hence the zero-padded counter
				var blockId = Convert.ToBase64String(Encoding.ASCII.GetBytes(i.ToString().PadLeft(6, '0')));
				var blockUrl = $"{sasUrl}&comp=block&blockid={blockId}";
				Console.WriteLine($"{i}: {i * maxBlockSize}, length {readSize}");
				blobList.Add(blockId);

				var bytesRead = await stream.ReadAsync(buffer, 0, readSize);
				var request = new RestRequest(blockUrl, Method.Put);
				request.AddBody(buffer);
				request.AddHeader("x-ms-blob-type", "BlockBlob");
				response = await client.PutAsync(request);
				uploadSuccess = response.IsSuccessful;
				if (!response.IsSuccessful)
					break;
			}
			stream.Close();
		}

		if (uploadSuccess)
		{
			var finalize = new RestRequest($"{sasUrl}&comp=blocklist");
			finalize.AddStringBody($"<BlockList>{string.Concat(blobList.Select(b => "<Latest>" + b + "</Latest>"))}</BlockList>", DataFormat.Xml);
			response = await client.PutAsync(finalize);
			Console.WriteLine($"Success: {response.IsSuccessful}");
		}
		else
			Console.WriteLine("FAILED");
	}
}

Thursday, January 6, 2022

Authoring resource with LUIS

Azure LUIS is a great cognitive service that requires two resources to be provisioned in order to work properly. The predictive resource can be created in almost any Azure region, but the authoring resource is limited to specific regions. The authoring resource is where the luis.ai site will save and 'train' your configuration; you'll then publish that configuration and it will be used by the predictive engine as you submit your requests. You can think of these two as training and competition. The authoring is where you lift weights, work on your cardio, practice your forms; the predictive is the applied results of your training. 
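
For reference, here's a rough sketch of provisioning the pair with the Azure CLI; the resource names and resource group are placeholders, and you should check the docs for the currently supported authoring regions (westus is one of them):

# authoring resource -- limited to specific regions
az cognitiveservices account create --name my-luis-authoring --resource-group my-rg --kind LUIS.Authoring --sku F0 --location westus --yes

# predictive (prediction) resource -- can live in most regions
az cognitiveservices account create --name my-luis-prediction --resource-group my-rg --kind LUIS --sku S0 --location eastus --yes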

Azure Functions and SCM_DO_BUILD_DURING_DEPLOYMENT

While deploying Azure Functions via Azure Pipelines, I discovered that if your code is already compiled, you should tell Azure not to build it on deploy. This is done by setting the app setting SCM_DO_BUILD_DURING_DEPLOYMENT to false. The standard practice is to build your artifacts once and then deploy that same build artifact to each subsequent environment. If your code is already built and this setting is missing, you'll get weird errors such as: "Error: Couldn't detect a version for the platform 'dotnet' in the repo." That error indicates Azure is trying to build your code; use the app setting above to prevent the build.
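
If you're setting it outside the portal, a quick sketch with the Azure CLI (the function app name and resource group are placeholders):

# tell the deployment engine (Kudu/Oryx) not to build the package on deploy
az functionapp config appsettings set --name my-func-app --resource-group my-rg --settings SCM_DO_BUILD_DURING_DEPLOYMENT=false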