tag:blogger.com,1999:blog-44606638171808231012024-03-14T02:52:37.065-05:00Notes on Software DevelopmentContains notes and lessons in working technology, especially .NET, Azure, DevOps, Agile, and Team Foundation Server.Unknownnoreply@blogger.comBlogger48125tag:blogger.com,1999:blog-4460663817180823101.post-10764213029455427842022-03-15T14:48:00.010-05:002022-03-15T14:59:41.517-05:00Azure Large Block Blob Upload with SAS and REST<p>Attempting to find an example of uploading a large block blob in Azure (splitting the file into blocks and finalizing with the blocklist API) using the SAS token approach against the REST endpoint <a href="https://gauravmantri.com/2013/02/16/uploading-large-files-in-windows-azure-blob-storage-using-shared-access-signature-html-and-javascript/">turned up a JavaScript example</a>. I worked through the code and converted it to C#. The important parts are that you append comp=block and a Base64-encoded blockid to the end of the SAS URL for each block, and that you finalize the upload by calling the comp=blocklist endpoint with an XML body listing all of the blocks.</p>
<pre><code class="csharp">async Task Main()
{
// Note: References Nuget Package RestSharp
var azureSasTokenGetUrl = "<your sas retrieval url>";
var maxBlockSize = 1024 * 1024 * 10; // 10MB block size
var fileToUpload = @"<path to a big ole file>";
var info = new FileInfo(fileToUpload);
// custom URL for getting sas token accepted a file name value
var client = new RestClient($"{azureSasTokenGetUrl}?name={info.Name}");
var response = await client.GetAsync(new RestRequest());
if (response.IsSuccessful)
{
// clean url of any quotes
var sasUrl = response.Content.Replace("\"", string.Empty);
Console.WriteLine($"SAS URL: {sasUrl}");
var blobList = new List<string>();
var uploadSuccess = false;
var fileLength = info.Length;
var numberBlocks = (fileLength % maxBlockSize == 0) ? (int)(fileLength / maxBlockSize) : (int)Math.Floor((double)fileLength / maxBlockSize) + 1;
client = new RestClient(sasUrl);
using (var stream = File.OpenRead(fileToUpload))
{
for (var i = 0; i < numberBlocks; i++)
{
var readSize = (int)(i == (numberBlocks - 1) ? fileLength - (i * maxBlockSize) : maxBlockSize);
var buffer = new byte[readSize];
var blockId = Convert.ToBase64String(ASCIIEncoding.ASCII.GetBytes(i.ToString().PadLeft(6, '0')));
var blockUrl = $"{sasUrl}&comp=block&blockid={blockId}";
Console.WriteLine($"{i}: {i * maxBlockSize}, length {readSize}");
blobList.Add(blockId);
var content = await stream.ReadAsync(buffer, 0, readSize);
var request = new RestRequest(blockUrl, Method.Put);
request.AddBody(buffer);
request.AddHeader("x-ms-blob-type", "BlockBlob");
response = await client.PutAsync(request);
uploadSuccess = response.IsSuccessful;
if (!response.IsSuccessful)
break;
}
stream.Close();
}
if (uploadSuccess)
{
var finalize = new RestRequest($"{sasUrl}&comp=blocklist");
finalize.AddStringBody($"<BlockList>{string.Concat(blobList.Select(b => "<Latest>" + b + "</Latest>"))}</BlockList>", DataFormat.Xml);
response = await client.PutAsync(finalize);
Console.WriteLine($"Success: {response.IsSuccessful}");
}
else
Console.WriteLine("FAILED");
}
}
</code></pre>
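<p>To sanity-check the result, the same SAS URL can be used to ask the service which blocks were actually committed, via the Get Block List operation (comp=blocklist with a blocklisttype parameter). A minimal sketch, assuming the SAS token also grants read access:</p>
<pre><code class="csharp">// Minimal sketch (not part of the original upload): list the committed blocks.
// Assumes the SAS token allows reads; otherwise this call will be rejected.
var verify = new RestRequest($"{sasUrl}&comp=blocklist&blocklisttype=committed");
var verifyResponse = await client.GetAsync(verify);
Console.WriteLine(verifyResponse.Content); // XML listing each committed block id and its size
</code></pre>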
Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4460663817180823101.post-68177910190816919182022-01-06T10:40:00.001-06:002022-01-06T10:40:17.239-06:00Authoring resource with Luis<p>Azure LUIS is a great cognitive service that requires two resources to be provisioned in order to work properly. The predictive resource can be created in almost any Azure region, but the authoring resource is <a href="https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-reference-regions">limited to specific regions</a>. The <i>authoring </i>resource is where the <a href="http://luis.ai">luis.ai</a> site will save and 'train' your configuration; you'll then publish that configuration and it will be used by the <i>predictive </i>engine as you submit your requests. You can think of these two as training and competition. The authoring is where you lift weights, work on your cardio, practice your forms; the predictive is the <i>applied results</i> of your training. </p>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4460663817180823101.post-41362413082199065312022-01-06T10:21:00.001-06:002022-01-06T10:21:29.914-06:00Azure functions and SCM_DO_BUILD_DURING_DEPLOYMENT<p>Deploying Azure Functions via Azure Pipelines, I discovered that if your code is already compiled, you should tell Azure not to build it on deploy. This is done with an <a href="https://docs.microsoft.com/en-us/azure/azure-functions/functions-deployment-technologies">app setting</a> <span style="font-family: courier;">SCM_DO_BUILD_DURING_DEPLOYMENT </span>which is set to false. The standard practice is to build your artifacts once, then deploy them to subsequent environments--the same build artifact is deployed to each environment. If your code is already built and this setting is missing, you'll get weird errors such as: "<span style="color: red;">Error: Couldn't detect a version for the platform 'dotnet' in the repo.</span>" That error indicates it is trying to build your code. Use the above app setting to prevent the build.</p>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4460663817180823101.post-21335372114677068132021-12-06T17:11:00.006-06:002021-12-06T17:11:43.608-06:00Setting up a new work laptop<p>I started at a new company today and am setting up my new laptop. 
I thought I would spend a few minutes writing my standard list of software that I install by default, for my future self (and perhaps, others).</p><p>Must have software:</p><p></p><ul style="text-align: left;"><li><a href="https://notepad-plus-plus.org/">Notepad++</a></li><li><a href="https://linqpad.net/">Linqpad</a></li><li><a href="https://www.microsoft.com/en-us/microsoft-365/onenote/digital-note-taking-app">OneNote</a></li><li><a href="https://7-zip.org/">7-Zip</a></li><li><a href="https://code.visualstudio.com/">Visual Studio Code</a></li><li><a href="https://www.postman.com/">Postman</a></li></ul><div>For Microsoft Development:</div><div><ul style="text-align: left;"><li><a href="https://visualstudio.microsoft.com/">Visual Studio</a></li><li><a href="https://www.microsoft.com/en-us/Download/details.aspx?id=101064">SQL Express</a></li><li><a href="https://docs.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms?view=sql-server-ver15">SSMS</a></li></ul><div>For Git integration:</div></div><div><ul style="text-align: left;"><li><a href="https://github.com/dahlbyk/posh-git">Posh-Git</a></li></ul><div>I also run, as Visual Studio add-ins:</div></div><div><ul style="text-align: left;"><li><a href="https://www.devexpress.com/products/coderush/">DevExpress CodeRush</a></li><li><a href="https://www.ncrunch.net/">NCrunch</a></li><li><a href="https://www.red-gate.com/products/sql-development/sql-prompt/">RedGate SQL Prompt </a>(SSMS add-in too)</li></ul><div>For Azure development:</div></div><div><ul style="text-align: left;"><li><a href="https://docs.microsoft.com/en-us/azure/cosmos-db/local-emulator?tabs=ssl-netstd21">Cosmos DB Emulator</a></li><li><a href="https://github.com/paolosalvatori/ServiceBusExplorer">Service Bus Explorer</a></li><li><a href="https://azure.microsoft.com/en-us/features/storage-explorer/#overview">Azure Storage Explorer</a></li></ul><div>I'll update as I think of more.</div></div><p></p>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4460663817180823101.post-91853853744703237732018-08-22T13:11:00.001-05:002018-08-22T13:15:53.592-05:00Cosmos Change Feed and Lease Collection Use<p>The <a href="https://docs.microsoft.com/en-us/azure/cosmos-db/change-feed">change feed</a> functionality in Azure Cosmos DB is a great feature with a wide variety of uses. The change feed subscribes to changes that occur in a collection and keeps its state <em>in another collection</em>, called the <a href="https://docs.microsoft.com/en-us/azure/cosmos-db/change-feed#understanding-the-change-feed-processor-library">Lease Collection</a>. Because the minimum RUs in a collection are 400 and a single lease hardly uses 50 of the RUs, we had decided to store additional meta data documents in the same lease collection, because we could not use the Continuation Token and Sequence Number data to determine where in the stream the subscriber was (for our own monitoring purposes). <strong>Don’t do this</strong>! The change feed library will regularly scan and pull all documents in this collection and use it to balance the subscribers and therefore the more documents are in this collection, the greater the RU requirements. 
Our RUs were bumping against 4,000 when we finally realized what it was doing:</p> <p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhEhQtxUxo5WqMTE5qnDPKx6cA5yRoXBB6bUGNZCm5w_Oaj_hd8EcQi_PKxPykE7CGgcltmKN5zEONGxpgpil8Ob2gdy6oxKs-YvQEJ98VAyHjQm___Klzfw4DTtCHnVZRyPvOgUWs5pxW_/s1600-h/image%255B7%255D"><img title="image" style="display: inline; background-image: none;" border="0" alt="image" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEip6VqUvq5XVJ16aakuXeDA-p11OccR-gRaXI4pJWaRWPd3zRhmYr1QOxaZ8K8pRAUtBdKwWNckf4ahl1tdRBm_ai7uRi4oy-Ifn9gV8QZxs0yQo86yRg94vvsa3uxNKBtt137GtjYuNhP9/?imgmax=800" width="286" height="184" /></a></p> <p>After removing our additional metadata documents, the lease collection RUs are much better (note that we have several subscribers using the same lease).</p> <p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj764uXPB18TX0xA4AcW-AZx9iCx7rsLGjLNU1TOCVcjjbpgFWlZQR5VGly0ZEl9s_UjWHsKkQ9RUwWunCcIcPfssjqWiVqXzGp4na7lflbPyq0HWSesaYvO-M2XqByX60NpoMuOpuwmabW/s1600-h/image%255B3%255D"><img title="image" style="display: inline; background-image: none;" border="0" alt="image" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEihUCTC1_W1HZJINblVBhc30f7KBVG8q_NA9g3MwQdMZDKFzpAftAbsA2admWQeYqS827zmcOdOMAOOD2Ib0IwlC41mDH63Ig4fmdLha_5EHsEQXoqOGNeIsupUfM8U4E4b5MCgbI5uDA_P/?imgmax=800" width="287" height="188" /></a></p> <p>Now we store our meta data documents in another collection, which happily sits at around 100 RUs, while leases, as you can see above, hovers around the same.</p> <p>Bottom line: <strong>don’t store anything custom</strong> in the Lease Collection of the change feed because it exponentially increases the RU requirements.</p>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4460663817180823101.post-79502575083454605752017-07-12T21:10:00.002-05:002017-07-12T21:10:40.938-05:00TFS Hosted Build Controller and .NET 4.7If you want the TFS hosted build controller to run .NET 4.7 builds, make sure to change the default agent queue to the "Hosted VS2017" version. You can do this by editing the build definition and in the Tasks Process landing page, it is the second option, as shown below.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj1SObKWeHJayEU1yNY94w4NXuMC7GnFpKL30saLjn6waXdmisQSoZ_KrzSO4R64vxuf-ZoMVHUJQImwsnTbKHXA91xb2oNR-emEnMnfvvik0_B3anRpdlcSv7yyB4aaidcHijGPX5yVoyo/s1600/2017-07-12_17-35-54.png" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" data-original-height="320" data-original-width="638" height="200" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj1SObKWeHJayEU1yNY94w4NXuMC7GnFpKL30saLjn6waXdmisQSoZ_KrzSO4R64vxuf-ZoMVHUJQImwsnTbKHXA91xb2oNR-emEnMnfvvik0_B3anRpdlcSv7yyB4aaidcHijGPX5yVoyo/s400/2017-07-12_17-35-54.png" width="400" /></a></div>
<br />Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4460663817180823101.post-26693404991317363012017-06-04T16:27:00.000-05:002017-06-04T16:27:46.975-05:00Backups, Synology, and the Recycle BinI am a big fan of Synology products, owning their <a href="http://amzn.to/2sF8qzC">5-bay DS1515+ NAS series</a> (now 1517+) and <a href="http://amzn.to/2qNPnm5">the RT2600 router</a>. The <a href="https://www.synology.com/en-us/dsm">Synology software</a> is fantastic, user-friendly, and allows additional packages to be installed, further extending the features of the product. Within the last year or so, they've greatly improved their cloud-syncing features, and <a href="https://www.synology.com/en-us/dsm/app_packages/CloudSync">the CloudSync package </a>provides an easy way to sync files to/from Synology to cloud providers. They also have packages for syncing to other storage options (such as <a href="https://www.synology.com/en-us/dsm/app_packages/GlacierBackup">Amazon's Glacier storage</a>).<br />
<br />
I had recently configured my NAS drive to backup all local files to <a href="https://www.amazon.com/clouddrive">Amazon Drive</a>, which offers <b>unlimited</b> storage for $59.99 a year (plus tax). Yes, <b>unlimited</b>! It's a great deal. If you don't have a cloud backup location for your files, Amazon's offering is worth looking into. And they have software that you can install across different platforms so you can sync local files and directories to the cloud.<br />
<br />
I was cleaning up the folder structure today in the Synology File Station software and accidentally deleted a root folder (yikes!). I immediately caught the error and paused the sync, but it was too late; some of the files had already been deleted, both on Synology and on Amazon. Fortunately, Amazon Drive has a "Recycle Bin" so I was able to recover the files. However, this prompted me to enable a feature I had assumed was already turned on in Synology: Synology's Recycle Bin. You should verify yours is turned on too.<br />
<br />
Navigate to the Synology <b>Control Panel</b>, choose the <b>Shared Folders</b> icon, select the appropriate folder, choose <b>Edit</b>, and check the "Enable Recycle Bin" option. Now if you do something terrible like deleting an important folder, at least you won't have to wait for hours to pull it back down, if you are syncing to another location.<br />
<br />
Two lessons learned:<br />
1. Make sure you have <i>more than one backup</i>. Seriously, buy some space on Amazon or another cloud provider, set up a sync, and make sure it completes. It's important!<br />
2. Make sure your folders have an "undelete" option available.<br />
<br />
And just for grins, the files on my Synology that are longer term, unchanging backups, are going to have a third backup location on <a href="https://aws.amazon.com/glacier/">Amazon's Glacier storage</a>, so that I am covered there. At $0.01/month/GB, it's a cheap option.<br />
<br />
<br />
<br />
<br />Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4460663817180823101.post-18210091911183556082017-04-21T21:35:00.000-05:002017-04-21T21:35:02.044-05:00Dictatorial ManagementIssues don't go away simply because you issue an edict or say that the issue will no longer happen.<br />
<br />
One of my favorites is a problem my team runs into almost every day: a developer checks code into the system and breaks the deployment to our test environment. Management's "solution" is that developers shouldn't break the deployment so there's no point in educating the developers on how to troubleshoot and resolve deployment issues.<br />
<br />
Now stop and re-read that last sentence. What!?<br />
<br />
Because an issue shouldn't occur, there's no point in educating people on how to solve the issue when it does occur. This is a typical management solution.<br />
<br />
Another common management solution is to threaten to track the number of issues per developer and have this count reflected on their next performance review. Thus far, I've never seen this done, as to do so would be quite onerous on the manager (and in reality, what does this actually solve?).<br />
<br />
How much more successful would companies, teams, and people be if we stopped the nonsense of impossible solutions? Simply stating 'this issue will never happen again', or threatening and cajoling, does nothing to actually solve a problem. Why don't we work in realities and actual, possible solutions instead of the ridiculous power-insanity of those at the top, or the simplistic manager solutions that do nothing to address the real issues?<br />
<br />
Do you want to actually provide a real solution? How about empowering your employees with the mastery and autonomy to actually care about what they do and the quality with which they do it? What if they had some ownership in the process and the success of what they are doing? What if instead of dictating, you stepped aside and let the team choose? It might not be good for your ego, but it sure would solve a lot more issues than a top down approach.Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4460663817180823101.post-64883643514575953842015-11-20T15:00:00.002-06:002021-12-21T16:17:45.319-06:00Getting all changesets associated to work itemsI had a need today to pull a list of all check-ins that had been associated to a certain list of work items. This can be done easily using the TFS API assemblies, most of which are located in C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\ReferenceAssemblies\v2.0. The code below will write the list of all file changes across all changesets for the specified work item IDs. <br />
<pre><code class="csharp">using (TextWriter tmp = Console.Out)
{
using (var fs = new FileStream("Test.txt", FileMode.Create))
{
using (var sw = new StreamWriter(fs))
{
Console.SetOut(sw);
var workItemIds = new int[] { 1,2,3 };
var collectionUri = new Uri("http://yourtfs/server/");
try
{
using (var tpc = new TfsTeamProjectCollection(collectionUri))
{
var workItemStore = tpc.GetService<WorkItemStore>();
var teamProject = workItemStore.Projects["project-name"];
var versionControlServer = tpc.GetService<VersionControlServer>();
var artifactProvider = versionControlServer.ArtifactProvider;
var workItems = workItemStore.Query(workItemIds, "Select [System.Id], [System.Title] from WorkItems");
var allChangesets = new List<Changeset>();
foreach (WorkItem workItem in workItems)
{
allChangesets.AddRange(
workItem.Links.OfType<ExternalLink>().Select(link => artifactProvider.GetChangeset(new Uri(link.LinkedArtifactUri)))
);
}
var orderedChangesets = allChangesets.OrderByDescending(c => c.CreationDate).ToArray();
foreach (var changeset in orderedChangesets)
{
Console.WriteLine("{0} on {1:MM-dd-yyyy HH:mm} by {2} ({3} change(s))", changeset.ChangesetId, changeset.CreationDate, changeset.Owner, changeset.Changes.Length);
foreach (var change in changeset.Changes)
{
Console.WriteLine(" [{0}] {1}", change.ChangeType, change.Item.ServerItem);
}
Console.WriteLine("-----");
Console.WriteLine();
}
}
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
}
sw.Close();
}
}
Console.SetOut(tmp);
Console.WriteLine("Press any key to exit...");
Console.ReadKey();
}</code></pre>
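<p>One caveat with the loop above: work items can carry external links that are not changesets (test results, storyboards, versioned items), and resolving those through GetChangeset will fail. A hypothetical guard, assuming changeset artifact URIs follow the usual vstfs:///VersionControl/Changeset/{id} form, could filter the links first:</p>
<pre><code class="csharp">// Hypothetical filter (not in the original code): only follow links that look like changeset artifacts.
var changesetLinks = workItem.Links.OfType<ExternalLink>()
    .Where(link => link.LinkedArtifactUri.StartsWith("vstfs:///VersionControl/Changeset/", StringComparison.OrdinalIgnoreCase));
allChangesets.AddRange(changesetLinks.Select(link => artifactProvider.GetChangeset(new Uri(link.LinkedArtifactUri))));
</code></pre>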
Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4460663817180823101.post-64721804871651849232015-11-11T14:51:00.001-06:002015-11-11T14:51:56.746-06:00Branching<p>I had a conversation with a co-worker yesterday about when to create branches in TFS and the conversation reflected a confusion surrounding the use of branches. The coworker suggested that a new branch should be created each time code was being pushed to any environment because it’s the only way you can be completely sure the branch isn’t polluted.</p> <p>To get the obvious out of the way, if a source control system “pollutes” a branch with no human intervention, get a new source control system. The system has failed at its most basic task. It is very likely this is not the case. It is probably how you are executing your branching and merging strategy.</p> <p>Let’s take a typical branching structure: Dev > QA > Prod; Dev is the parent of QA, which is the parent of Prod. Changes are merged from Dev into QA, then from QA into Prod. <strong>You should never get merge conflicts when going from Dev to QA, or QA to Prod</strong>. Because of this, merging is clean and no “pollution” can happen.</p> <p>How is this possible?</p> <p>The only way a merge conflict happens is when a change has occurred in the target branch you are merging into which has not been integrated in the source branch. But - and this is the critical point - if you are following <a href="http://aka.ms/treasure18">good merging practices</a>, if a change must be made in QA or Prod, it is <em><strong>immediately</strong></em> merged into the parent branch(es). No exceptions! If I make a change in the QA branch, my next immediate check-in is <em>a merge to the Dev branch from QA</em>. I will resolve any merge conflicts with this merge, ensuring that my change is properly integrated in Dev. The next time Dev is merged into QA, it will already have this change, and so no merge conflicts will occur.</p> Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4460663817180823101.post-82031083392171499562014-05-14T17:09:00.000-05:002015-07-06T16:38:44.299-05:00PowerShell Copy-Item RecursionI ran into something interesting today when doing a recursive copy in PowerShell. The “*” character makes a huge difference when you are copying to an existing directory with files.<br />
<br />
For example, if I have a d:\temp\files directory that I want to copy to d:\temp\files2, I can do so by doing:<br />
<br />
<blockquote class="tr_bq">
<span style="font-family: Courier New, Courier, monospace;">copy-item d:\temp\files d:\temp\files2 –recurse –force</span></blockquote>
<br />
However, run that same line again after changing a file or two and, though it appears to run, it won’t actually copy the changed files. To make it copy them, you must include a * at the end of the source path:<br />
<br />
<blockquote class="tr_bq">
<span style="font-family: Courier New, Courier, monospace;">copy-item d:\temp\files\* d:\temp\files2 –recurse –force</span></blockquote>
Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4460663817180823101.post-80613159200299977022013-07-24T17:17:00.000-05:002015-07-06T16:36:26.422-05:00Windows Server 2012 and Installation MediaOn Windows Server 2012, the installation of the server does not include all of the usual components necessary to support activating some of the server features and roles, such as the Application Server role with the .NET 3.5 framework. The server will give you a warning about missing components and ask for a location of the installation media. If you have the ISO or disk for the installation, you will find these components under <b>[drive]:\sources\sxs</b>.<br />
<br />
I have to say I am not a fan of this model. Early versions of both client and server kept the components separate, requiring you to have the original media every time you wanted to add a Windows feature. Recent versions have solved this by including them, but certainly increased the size on disk. I’d prefer the components to be copied, to avoid the necessity of installation media, and this can be done by manually copying the above directory to a location on the disk. Why did Microsoft change this from being automatic?Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4460663817180823101.post-22980341404223222013-07-22T18:49:00.000-05:002015-07-06T16:34:41.493-05:00Windows InTune and User ManagementThe addresses for Windows InTune and the user management online applications are not similar or on the same domain root, but are important to know if you wish to create users that InTune will then assign to devices. Unfortunately, these are not obvious or easy to find. So, in order for me to remember where they are when I am looking for them again in a few months, here they are.<br />
<br />
If you wish to log into Windows InTune administration, you need to go to <a href="http://manage.microsoft.com/">http://manage.microsoft.com</a>.<br />
<br />
If you wish to log into Intune User Administration, you need to go to <a href="http://account.manage.microsoft.com/">http://account.manage.microsoft.com</a>.<br />
<br />
If you wish to manage your profile, you need to go to <a href="http://portal.microsoftonline.com/">http://portal.microsoftonline.com/</a>.<br />
<br />
If you wish to view the company portal, you need to go to <a href="http://portal.manage.microsoft.com/">http://portal.manage.microsoft.com</a>.<br />
<br />
Not confusing at all…Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4460663817180823101.post-19759737662864088802013-06-11T16:04:00.000-05:002015-07-06T16:31:52.425-05:00Zip in .NET 4.5I was trying to zip a file today in .NET 4.5 and after looking around on the internet for a bit, realized a lot of the code was for the beta of the 4.5 framework and not the final version. There were some API changes between the two and so the process changed a bit. Here is a sample of how to zip a file in .NET 4.5:
<pre class="brush: csharp; title: ; notranslate" title="">
// create a zip file
var zipFilePath = "c:\\myfile.zip";
using (var zipFile = new FileStream(zipFilePath, FileMode.CreateNew))
{
    using (var archive = new ZipArchive(zipFile, ZipArchiveMode.Create, false))
    {
        // add a single entry to the archive
        var dbZip = archive.CreateEntry("log.txt", CompressionLevel.Optimal);
        using (var writer = new BinaryWriter(dbZip.Open()))
        {
            writer.Write(File.ReadAllBytes("c:\\log.txt")); // here is where we read the file into the zip
            writer.Close();
        }
    }
    zipFile.Close();
}
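
// A minimal sketch (not from the original post) of the reverse direction: reading the archive back.
// Assumes a reference to System.IO.Compression.FileSystem, which provides the ZipFile class in .NET 4.5.
using (var archive = ZipFile.OpenRead(zipFilePath))
{
    foreach (var entry in archive.Entries)
    {
        Console.WriteLine("{0} ({1} bytes)", entry.FullName, entry.Length);
    }
}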
</pre>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4460663817180823101.post-5710051732293852912013-05-14T12:24:00.000-05:002015-07-06T16:29:48.096-05:00The backups, the backups, oh the… wait, I don’t have them… DOH!I know I should backup everything regularly. I work in software. Really, I have no excuse. I’ve read <a href="http://www.hanselman.com/blog/TheComputerBackupRuleOfThree.aspx">several </a><a href="http://www.hanselman.com/blog/AutomaticallyBackupYourGmailAccountOnAScheduleWithGMVaultAndWindowsTaskScheduler.aspx">of </a><a href="http://www.hanselman.com/blog/BACKUPYOURCRAPMissingOperatingSystemBackupsDiskImagesHomeServersBootRecBootMgrRebuildBCDFixBootAndProblemsPlural.aspx">Scott </a><a href="http://www.hanselman.com/blog/OnLosingDataAndAFamilyBackupStrategy.aspx">Hanselman’s </a><a href="http://www.hanselman.com/blog/ABasicNoncloudbasedPersonalBackupStrategy.aspx">posts </a>on setting up regular backups. I have some (manual) backups that I do on occasion, but nothing regular and certainly not comprehensive. A few weeks ago, the blog hosting service canceled my account, they did not keep any backups (why would they?) and I did not have any backups of my blog posts. Shame on me! And so I am here, attempting to recover some of the posts through Google’s cache and <a href="http://archive.org/">the internet archive</a>, with limited success. The images are gone, but at least the text content is available.<br />
<br />
The main lesson I have learned through this is that backups must be automated if they are going to happen regularly. A manual backup is nice, but I forget, I get busy, I tell myself I’ll do it tomorrow, or make any number of other excuses. The tools exist today for automating most backups. I have no excuse. Neither do you.<br />
<br />
I echo <a href="http://www.hanselman.com/blog/TheComputerBackupRuleOfThree.aspx">Scott’s recommendation of having at least three backup locations</a>. I’ve spent the last several days setting these up, with cloud storage being one, a local NAS (network attached storage) drive being the second, and an external hard drive being a third. I’m still in the process of doing this. I will also be setting up services that run daily/weekly that will backup my mail, blog posts, and any content that regularly changes. By the end of this, I’ll likely be backup crazy, but next time, I won’t lose content!Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4460663817180823101.post-2086568916239330092012-12-18T09:22:00.000-06:002015-07-06T16:26:49.520-05:00Webinar on Windows 8 App DevelopmentI recently did a webinar on some of the features that Windows 8 Store App developers can easily add to their apps to enhance the user experience. You can <a href="http://youtu.be/B0yC5Z303pE">view the webinar here</a>. I specifically covered Semantic Zoom, Live Tiles, Snap, and Search integration with the Windows 8 charms bar.<br />Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4460663817180823101.post-82938671494059226882012-11-16T12:33:00.000-06:002015-07-06T16:54:37.197-05:00SQL Server Database Design GotchasThis is a long over-due post about some SQL Server database design gotchas that I have run across as I’ve worked with SQL Server over the years. A couple of these are already “best practices” and the others are simply practices I’ve learned that improve the database design.<br />
<br />
1. Don’t use UniqueIdentifier (GUID) as the clustered key of a table.<br />
<br />
This results in massive table fragmentation, because the physical order of the data in the table is based on this column. Since GUIDs are by nature random rather than sequential, new rows are inserted all over the table instead of being appended at the end. It is best to choose a sequential column or set of columns as the clustered key of a table.<br />
<br />
2. Index your foreign keys.<br />
<br />
SQL Server does not index foreign keys and thus you must do this manually. This is a good practice because you will be joining tables on the primary/foreign key relationships and therefore having an index on the foreign keys will allow SQL Server to use the index on those joins. If you follow point #3 on your database design, you can use the following SQL to generate your foreign keys (SQL 2008+):<br />
<br />
<pre class="brush: sql; title: ; notranslate" title="">SELECT 'CREATE NONCLUSTERED INDEX IDXFK_'+SCHEMA_NAME([d].schema_id)+'_'+[d].[name] +'_'+ OBJECT_NAME(a.referenced_object_id) + '_' + [c].[name] + ' ON '+ SCHEMA_NAME([d].schema_id)+'.'+[d].[name] + '('+[c].[name]+');'
FROM sys.foreign_keys a
INNER JOIN sys.foreign_key_columns b ON a.parent_object_id = b.parent_object_id AND a.referenced_object_id = b.referenced_object_id AND a.object_id = b.constraint_object_id
INNER JOIN sys.columns c ON a.parent_object_id = c.object_id AND b.parent_column_id = c.column_id
INNER JOIN sys.tables d ON c.object_id = d.object_id
ORDER BY [d].[name], [c].[name]
</pre>
<br />
<br />
3. Don’t create nullable BIT columns.<br />
<br />
This is a logical error anyway. A bit (or boolean) by definition is either true or false. If there is a third option, use another data type like tinyint. And create a default constraint on the bit column to save yourself on inserts.<br />
<br />
4. Use a single column for your primary key (avoid composite primary keys).<br />
<br />
The benefits of this approach are numerous. One column is the identifier of the row, any foreign keys back to the table are also one column, the join is simpler, the index is smaller, etc. With composite keys, you can quickly end up in a situation where the great-grandchild table has 4 columns in its primary key, and who likes to type that much code for joins?<br />
<br />
5. Use unique constraints to specify the business keys (corollary to the previous point).<br />
<br />
While the primary key is a single column, use a unique constraint to specify the business key instead of defining the primary key as the business key.<br />
<br />
6. Be consistent in your naming.<br />
<br />
Self-explanatory – it is difficult to maintain a database that is inconsistent. Even when making changes to an existing database, stick with the convention already defined instead of introducing your own, even if you disagree with it.<br />
<br />
7. Avoid making the primary key column of a lookup table an identity column if the table is one where the primary key value will have meaning.<br />
<br />
For system lookup tables (typically those you would generate as enums in code), I’d recommend not applying an identity specification on the key column (thus requiring the value to be set explicitly on insert), and I’d change the column name to end in “Cd” instead of “Id” to denote that it is an explicit value that can be coded against, as in the example below.<br />
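<p>As an illustration (hypothetical names, not from the original post), the enum such a table maps to would set each member’s value explicitly so it always matches the “Cd” column of the lookup table:</p>
<pre class="brush: csharp; title: ; notranslate" title="">// Hypothetical example: OrderStatusCd values in the lookup table are mirrored here,
// so code can reference the rows without a database lookup.
public enum OrderStatus
{
    Pending = 1,
    Shipped = 2,
    Cancelled = 3
}
</pre>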
<br />
That’s it for now. Enjoy!Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4460663817180823101.post-44295101280336256272012-08-21T14:43:00.000-05:002015-07-06T16:58:30.021-05:00Installing Windows 8 Enterprise and Activation ErrorAfter installing Windows 8 Enterprise edition, I received the following error:<br />
<br />
<span style="color: red;">Windows can’t activate right now. Please try again later.</span><br />
<br />
Trying later results in the same message. The problem turns out to be that the system needs a product key. To supply one, simply run an elevated command prompt and type:<br />
<br />
<span style="font-family: Courier New, Courier, monospace;">slmgr.vbs –ipk “ENTER PRODUCT KEY”</span><br />
<br />
Once this is done, you will be able to activate Windows (and on mine, it was already activated when I went to the activation center).Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4460663817180823101.post-88024750269299778162012-08-07T17:33:00.000-05:002015-07-06T17:01:04.099-05:00Delimited List of ColumnsI frequently need to get a list of columns of a database table in a delimited format, and have found myself rediscovering the following query:<br />
<br />
<pre class="brush: sql; title: ; notranslate">select '['+ [name] + '],'
from sys.columns
where object_name(object_id) = 'table-name'
order by column_id
for xml path('')
</pre>
<br />
<br />
For a table with three columns, the return from the above would be:<br />
<br />
<span style="font-family: Courier New, Courier, monospace;">[Column1],[Column2],[Column3],</span><br />
<br />
Enjoy!Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4460663817180823101.post-4382587956208514142012-07-05T15:30:00.000-05:002015-07-06T17:26:44.201-05:00Creating Scripts to Automate a Local RebuildAs a project grows larger, it tends to take a longer time to pull down the latest from source code, recompile all the code, and if required, redeploy the database to your local machine so you have the latest of everything. A few weeks ago a coworker remarked that he would like to run a script that would clean, get latest, rebuild everything, and reset his database to the latest from source control, while he goes and gets coffee (our project/local ‘reset’ is a multi-step process that takes about 15 minutes to run). It was a great idea, and so I went off and created something that is now in use by the team, and something you may find useful in yours. Let’s walk through it.<br />
<br />
Using an MSBuild build file, I created targets for each action: Clean, GetLatest, Compile, BuildDatabase, and DeployDatabase. At the top of the file there are properties (PropertyGroups and ItemGroups) defined that provide some “configuration” information for what will be run. Notice that the DeployDatabase target explicitly depends on the BuildDatabase target; the other targets do not have dependencies because I want them to be able to run separately.<br />
<br />
<pre class="brush:xml;"><project defaulttargets="Compile" toolsversion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<propertygroup>
<!-- general global properties -->
<configuration condition="'$(Configuration)' == ''">Debug</configuration>
<platform condition="'$(Platform)' == ''">Any CPU</platform>
<runcodeanalysis>false</runcodeanalysis>
<skipinvalidconfigurations>true</skipinvalidconfigurations>
<databasename>YourDBName</databasename>
<dropdatabasesql>"if exists(select top 1 1 from sys.databases where [name] = '$(DatabaseName)') BEGIN ALTER DATABASE $(DatabaseName) SET SINGLE_USER WITH ROLLBACK IMMEDIATE;drop database $(DatabaseName);END"</dropdatabasesql>
<databaseserver>.\sqlexpress</databaseserver>
<databasesqlcmd>sqlcmd -E -S $(DatabaseServer) -Q $(DropDatabaseSQL)</databasesqlcmd>
</propertygroup>
<itemgroup>
<!-- these represent the paths in TFS source control, relative to this build file, of the paths that need the GET LATEST operation applied -->
<getpath include="$(MSBuildProjectDirectory)\Path1">
<getpath include="$(MSBuildProjectDirectory)\Path2">
</getpath></getpath></itemgroup>
<propertygroup>
<!-- It appears MSBuild does not pass in every property defined above, so instead of repeating these every time, this creates one consolidated property that just needs to be kept up to date-->
<explicitrequiredproperties>RunCodeAnalysis=$(RunCodeAnalysis);SkipInvalidConfigurations=true;RestorePackages=false;</explicitrequiredproperties>
</propertygroup>
<itemgroup>
<!-- add a line in here for each solution you wish to build -->
<solutionprojectstobuild include="YourSolution.sln">
</solutionprojectstobuild></itemgroup>
<itemgroup>
<!-- add a line in here for each database you wish to build -->
<databaseprojectstobuild include="yourdbproject.dbproj">
</databaseprojectstobuild></itemgroup>
<target name="Clean">
<msbuild projects="@(SolutionProjectsToBuild)" targets="Clean">
</msbuild></target>
<target name="GetLatest">
<exec command="tf get %22%(GetPath.Identity)%22 /recursive">
</exec></target>
<target name="Compile">
<msbuild buildinparallel="true" projects="@(SolutionProjectsToBuild)" properties="$(ExplicitRequiredProperties)">
</msbuild></target>
<target name="BuildDatabase">
<msbuild projects="@(DatabaseProjectsToBuild)" properties="$(ExplicitRequiredProperties)" targets="Build">
</msbuild></target>
<target dependsontargets="BuildDatabase" name="DeployDatabase">
<exec command="$(DatabaseSqlCmd)">
<msbuild projects="@(DatabaseProjectsToBuild)" properties="$(ExplicitRequiredProperties)" targets="Deploy">
</msbuild></exec></target>
</project>
</pre>
<br />
The items you will need to configure are the DatabaseName, GetPath, SolutionProjectsToBuild, and DatabaseProjectsToBuild values. Once this is done, you should place this file near or at the root of your project (the location of the file will be the MSBuildProjectDirectory value). You can then execute MSBuild, calling the targets, or chaining them together. You should run this from a Visual Studio Command Prompt, as it uses TF.exe. This command is part of the TFS Power Tools, so ensure that you have it installed.<br />
<br />
For our project, I created a .bat file that contains the following (note my build file is called master.build):<br />
<br />
call "C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\vcvarsall.bat" x86<br />
C:\Windows\Microsoft.NET\Framework\v4.0.30319\Msbuild.exe master.build /t:GetLatest<br />
<br />
I have defined additional targets in my build file to run multiple steps; here is an example of one:<br />
<pre class="brush:xml;">
<Target Name="CleanGetLatest">
<CallTarget Targets="Clean;GetLatest"/>
</Target>
</pre>
<br />
If I want to execute two steps at the same time, in my bat file, I do the following (deploy database pops up in a second command window):<br />
<pre class="brush:none;">
cmd /c start cmd /k "C:\Windows\Microsoft.NET\Framework\v4.0.30319\Msbuild.exe master.build /t:DeployDatabase /m:2"
C:\Windows\Microsoft.NET\Framework\v4.0.30319\Msbuild.exe master.build /t:Compile /m:2
</pre>
<br />
And since the team wanted options, I did the following in my bat file:
<pre class="brush:none;">
call "C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\vcvarsall.bat" x86
cls
echo Please select one of the following options:
echo 1. Get Latest, Compile, and Deploy DB (default in 10 seconds)
echo 2. Get Latest and Compile only
echo 3. Compile only
echo 4. Deploy Database Only
echo -
choice /C:1234 /N /D:1 /T:10 /M:"Your selection >> "
if %ERRORLEVEL% == 1 GOTO FULL
if %ERRORLEVEL% == 2 GOTO GETCOMPILE
if %ERRORLEVEL% == 3 GOTO COMPILEONLY
if %ERRORLEVEL% == 4 GOTO DATABASEONLY
GOTO END
:FULL
C:\Windows\Microsoft.NET\Framework\v4.0.30319\Msbuild.exe master.build /t:CleanGetLatest
cmd /c start cmd /k "C:\Windows\Microsoft.NET\Framework\v4.0.30319\Msbuild.exe master.build /t:DeployDatabase /m:2"
C:\Windows\Microsoft.NET\Framework\v4.0.30319\Msbuild.exe master.build /t:Compile /m:2
GOTO END
:GETCOMPILE
C:\Windows\Microsoft.NET\Framework\v4.0.30319\Msbuild.exe master.build /t:CleanGetLatest
cmd /c start cmd /k "C:\Windows\Microsoft.NET\Framework\v4.0.30319\Msbuild.exe master.build /t:BuildDatabase /m:2"
C:\Windows\Microsoft.NET\Framework\v4.0.30319\Msbuild.exe master.build /t:Compile /m:2
GOTO END
:COMPILEONLY
C:\Windows\Microsoft.NET\Framework\v4.0.30319\Msbuild.exe master.build /t:Clean /m:2
cmd /c start cmd /k "C:\Windows\Microsoft.NET\Framework\v4.0.30319\Msbuild.exe master.build /t:BuildDatabase /m:2"
C:\Windows\Microsoft.NET\Framework\v4.0.30319\Msbuild.exe master.build /t:Compile /m:2
GOTO END
:DATABASEONLY
C:\Windows\Microsoft.NET\Framework\v4.0.30319\Msbuild.exe master.build /t:DeployDatabase /m:2
GOTO END
:END
echo on
Pause
</pre>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4460663817180823101.post-88611359612043637712012-06-06T12:34:00.000-05:002015-07-06T17:29:15.922-05:00Runas /netonlyI discovered the runas /netonly gem today and it is wonderful! Yes, wonderful!<br />
<br />
The runas command allows you to run a program on a machine as a different user than what you are logged in as. This is great when you are doing same domain activities, but what about crossing domains? This is the issue the /netonly switch solves. If you include this switch, it will run the program as your logged in user, but any network calls will be sent as if they came from the user you specified! Thus, if I execute something like:<br />
<br />
runas /netonly /user:AnotherDomain\AnotherUser devenv<br />
<br />
This runs Visual Studio as my user, but any network calls (TFS, Database, etc.) will use AnotherDomain\AnotherUser. A SQL Server that allows only Windows Authentication can now be connected to from a machine that is not on the same domain as the SQL Server. You can run code locally in Visual Studio and debug, connecting via Windows Authentication to a server on a different domain.<br />
<br />
Yes, this is wonderful!Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4460663817180823101.post-13305533134696159862012-05-08T16:30:00.000-05:002015-07-06T17:31:07.313-05:00Config Transforms for ElementsI would have thought this was obvious, but it took me a bit to figure it out. The config file transforms that are available <a href="http://msdn.microsoft.com/en-us/library/dd465326.aspx">for web.config files</a> and for <a href="http://visualstudiogallery.msdn.microsoft.com/69023d00-a4f9-4a34-a6cd-7e854ba318b5">all other files</a> can be used to replace sections of a config file, based on a project configuration. Most of the examples show changing attributes. I wanted to change the entire element, in this instance, connectionStrings. To do this, you simply put the Replace value in the Transform attribute on the element:
<br />
<pre class="brush:xml"><connectionstrings xdt:transform="Replace">
</connectionstrings></pre>
Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4460663817180823101.post-67382952672790535652012-05-06T11:43:00.000-05:002015-07-06T17:32:55.492-05:00Troubleshooting Build FailuresWe are having random build failures which have been a huge pain to troubleshoot (although we know that it has something to do with multi-core builds), so I wrote a quick Powershell script today that will run the build against the solution repeatedly until the build fails.
<br />
<pre class="brush: powershell; title: ; notranslate">while($true)
{
C:\Windows\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe /nologo /target:Rebuild YourSolutionFile.sln /p:SkipInvalidConfigurations=true /p:DeployOnBuild=False /p:Configuration=Debug /p:RestorePackages=false /m:2 /p:OutDir="E:\play\Binaries\\" | Out-Host
if ($LastExitCode -ne 0)
{
break;
}
}
</pre>
Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4460663817180823101.post-56508817914242779342012-04-10T13:34:00.000-05:002015-07-06T17:35:49.624-05:00Crossing Domains without Password PainI work in consulting, and as such, my machine is never on the client domain. This causes some headache because every time I want to connect to a client resource, I am prompted for my client username and password. By default, the client’s resources are in the Internet zone, so nothing is trusted, nor are my saved passwords used to authenticate – I am asked to reenter my password each time I connect to any resource. Fortunately, there is an easy solution – simply add the domain as a Local Intranet site in your Internet settings. Under Internet Options, Security, Local Intranet, Sites, Advanced, you can enter the paths to the client resource that you commonly access. Once this is done, after you check the “Save Password” option when you access a resource again, you won’t have to reenter the password again.<br />
<br />
If the client has a TFS instance, you can save your username and password used to access it by going to the team web access site in Internet Explorer and saving your credentials when prompted. When you launch Visual Studio and connect to that TFS instance, it will then use your saved credentials.Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-4460663817180823101.post-74831959827242977932012-04-09T11:16:00.000-05:002015-07-06T17:40:14.195-05:00Setting up a new IIS Server for ASP.NET or MVCWhen setting up a new Windows Server for hosting ASP.NET or MVC applications, I have several Powershell scripts that I run to modify some of the default IIS settings. You can also modify the IIS settings manually, but don’t fear the command line – it is your friend.<br />
<br />
Powershell has <a href="http://technet.microsoft.com/en-us/library/ee790599.aspx">an IIS module</a> that you will need to import to run most of these commands – WebAdministration. Now for the first set of scripts:<br />
<pre class="brush: powershell; title: ; notranslate" title="">Import-Module WebAdministration
#expire web content after 30 days
Set-WebConfigurationProperty -filter "/system.webServer/staticContent/clientCache" -name cacheControlMode -value "UseMaxAge"
Set-WebConfigurationProperty -filter "/system.webServer/staticContent/clientCache" -name cacheControlMaxAge -value "30.00:00:00"
# change logging to include two more properties
Set-WebConfigurationProperty -filter "/system.applicationHost/sites/siteDefaults/logFile" -name logExtFileFlags -value "Date, Time, ClientIP, UserName, ServerIP, Method, UriStem, UriQuery, HttpStatus, Win32Status, BytesSent, BytesRecv, TimeTaken, ServerPort, UserAgent, HttpSubStatus"
# change the IIS server's header value to from value -- applies to ENTIRE SERVER
$computer = gc env:computername
Set-WebConfiguration -filter "/system.webServer/httpProtocol/customHeaders/add[@value='ASP.NET']/@name" -value "From"
Set-WebConfiguration -filter "/system.webServer/httpProtocol/customHeaders/add[@name='From']/@value" -value $computer
</pre>
<br />
The above scripts are mostly self-explanatory – adjusting logging, static caching, and making sure the HTTP header of the sites on the box will include the box name. This is especially useful in load-balanced scenarios, when you need to troubleshoot an errant server.<br />
<br />
The next script modifies IIS to allow anonymous and windows authentication to be set in the web.config of child applications.<br />
<pre class="brush: powershell; title: ; notranslate" title=""># change the master IIS config file to allow override of anonymous and windows auth
[xml]$config = Get-Content C:\Windows\System32\inetsrv\config\applicationHost.config
$config.selectSingleNode("/configuration/configSections/sectionGroup[@name='system.webServer']/sectionGroup[@name='security']/sectionGroup[@name='authentication']/section[@name='anonymousAuthentication']").SetAttribute("overrideModeDefault", "Allow")
$config.selectSingleNode("/configuration/configSections/sectionGroup[@name='system.webServer']/sectionGroup[@name='security']/sectionGroup[@name='authentication']/section[@name='windowsAuthentication']").SetAttribute("overrideModeDefault", "Allow")
$config.Save("C:\Windows\System32\inetsrv\config\applicationHost.config")
</pre>
<br />
By default IIS does not allow child applications to define their own authentication. You can change a site’s security policy in the IIS manager, but this modifies the security settings in the applicationHost.config file instead of the web.config of the application. The script above allows the authentication settings to be defined in the local site’s web.config instead.<br />
<br />
And finally, I prefer IIS to be clear of any default sites and application pools before I start adding my own, so I remove them (<span style="color: red;">Warning: this will clear all sites and application pools from a server</span>):
<br />
<pre class="brush: powershell; title: ; notranslate" title=""># RESET IIS environment
Remove-Item 'IIS:\AppPools\*' -Recurse
Remove-Item 'IIS:\Sites\*' -Recurse
</pre>
Unknownnoreply@blogger.com0