Friday, November 20, 2015

Getting all changesets associated with work items

I had a need today to pull a list of all check-ins that had been associated with a certain list of work items. This can be done easily using the TFS API assemblies, most of which are located in C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\ReferenceAssemblies\v2.0. The code below writes the list of all file changes across all changesets for the specified work item IDs.
// Requires references to Microsoft.TeamFoundation.Client,
// Microsoft.TeamFoundation.VersionControl.Client, and
// Microsoft.TeamFoundation.WorkItemTracking.Client.
using (TextWriter tmp = Console.Out)
{
    using (var fs = new FileStream("Test.txt", FileMode.Create))
    using (var sw = new StreamWriter(fs))
    {
        // Redirect console output to the file for the duration of the query.
        Console.SetOut(sw);

        var workItemIds = new int[] { 1, 2, 3 };
        var collectionUri = new Uri("http://yourtfs/server/");

        try
        {
            using (var tpc = new TfsTeamProjectCollection(collectionUri))
            {
                var workItemStore = tpc.GetService<WorkItemStore>();
                var versionControlServer = tpc.GetService<VersionControlServer>();
                var artifactProvider = versionControlServer.ArtifactProvider;

                // This overload queries only the given IDs; the WIQL must not contain a WHERE clause.
                var workItems = workItemStore.Query(workItemIds, "Select [System.Id], [System.Title] from WorkItems");
                var allChangesets = new List<Changeset>();

                foreach (WorkItem workItem in workItems)
                {
                    // Each changeset association is stored as an external link on the work item.
                    allChangesets.AddRange(
                        workItem.Links.OfType<ExternalLink>()
                                .Select(link => artifactProvider.GetChangeset(new Uri(link.LinkedArtifactUri))));
                }

                var orderedChangesets = allChangesets.OrderByDescending(c => c.CreationDate).ToArray();
                foreach (var changeset in orderedChangesets)
                {
                    Console.WriteLine("{0} on {1:MM-dd-yyyy HH:mm} by {2} ({3} change(s))",
                        changeset.ChangesetId, changeset.CreationDate, changeset.Owner, changeset.Changes.Length);
                    foreach (var change in changeset.Changes)
                    {
                        Console.WriteLine("    [{0}] {1}", change.ChangeType, change.Item.ServerItem);
                    }
                    Console.WriteLine("-----");
                    Console.WriteLine();
                }
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex.Message);
        }
    }

    // Restore the original console output.
    Console.SetOut(tmp);
    Console.WriteLine("Press any key to exit...");
    Console.ReadKey();
}

Wednesday, November 11, 2015

Branching

I had a conversation with a co-worker yesterday about when to create branches in TFS, and the conversation revealed some confusion surrounding the use of branches. The co-worker suggested that a new branch should be created each time code is pushed to any environment, because it’s the only way you can be completely sure the branch isn’t polluted.

To get the obvious out of the way: if a source control system “pollutes” a branch with no human intervention, get a new source control system, because it has failed at its most basic task. But it is very unlikely that this is what’s happening. The problem is probably in how you are executing your branching and merging strategy.

Let’s take a typical branching structure: Dev > QA > Prod; Dev is the parent of QA, which is the parent of Prod. Changes are merged from Dev into QA, then from QA into Prod. You should never get merge conflicts when going from Dev to QA, or QA to Prod. Because of this, merging is clean and no “pollution” can happen.

How is this possible?

The only way a merge conflict happens is when a change has occurred in the target branch you are merging into that has not been integrated into the source branch. But - and this is the critical point - if you are following good merging practices, any change made in QA or Prod is immediately merged into the parent branch(es). No exceptions! If I make a change in the QA branch, my next immediate check-in is a merge from QA to the Dev branch. I resolve any merge conflicts during this merge, ensuring that my change is properly integrated into Dev. The next time Dev is merged into QA, it will already contain this change, and so no merge conflicts will occur.
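The workflow above can be sketched with the tf.exe command line. This is only an illustration; the server paths and comments are hypothetical placeholders, not a real project.

```shell
# Hypothetical paths; substitute your own team project structure.
# 1. A hotfix is checked in directly to the QA branch.
tf checkin /comment:"Hotfix: correct tax rounding" $/MyProject/QA/Billing/Tax.cs

# 2. Immediately merge that change up into the parent (Dev) branch.
tf merge $/MyProject/QA $/MyProject/Dev /recursive

# 3. Resolve any conflicts now, while the change is fresh, then check in the merge.
tf resolve /auto:AutoMerge
tf checkin /comment:"Merge QA hotfix into Dev"

# Later, the normal downward merge (Dev -> QA) already contains the fix,
# so it produces no conflicts.
tf merge $/MyProject/Dev $/MyProject/QA /recursive
```

Because the upward merge happens immediately, the downward merges stay one-directional and conflict-free.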

Wednesday, May 14, 2014

PowerShell Copy-Item Recursion

I ran into something interesting today when doing a recursive copy in PowerShell. The “*” character makes a huge difference when you are copying to an existing directory with files.

For example, if I have a d:\temp\files directory that I want to copy to d:\temp\files2, I can do so by doing:

copy-item d:\temp\files d:\temp\files2 -recurse -force

However, run that same line again after changing a file or two and, though it appears to succeed, it won’t actually copy the changed files: when the destination directory already exists, PowerShell treats it as a container and nests the source folder inside it instead of overwriting the existing files. For the overwrite to happen, you must include a * at the end of the source path:

copy-item d:\temp\files\* d:\temp\files2 -recurse -force

Wednesday, July 24, 2013

Windows Server 2012 and Installation Media

On Windows Server 2012, the installation of the server does not include all of the usual components necessary to support activating some server features and roles, such as the Application Server role with the .NET 3.5 framework.  The server will warn you about missing components and ask for the location of the installation media.  If you have the ISO or disc for the installation, you will find these components under [drive]:\sources\sxs.

I have to say I am not a fan of this model.  Early versions of both client and server kept the components separate, requiring you to have the original media every time you wanted to add a Windows feature.  Recent versions solved this by including them, at the cost of increased size on disk.  I’d prefer the components to be copied during installation, avoiding the need for installation media, and this can be done by manually copying the above directory to a location on disk.  Why did Microsoft change this from being automatic?
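As a sketch of how this plays out in practice, enabling .NET 3.5 from the mounted media might look like the following (the D: drive letter is an assumption; use whatever drive the ISO or disc is mounted as):

```shell
# Assumes the installation media is mounted as D:.
# Enable the .NET Framework 3.5 feature, pointing DISM at the
# side-by-side component store on the media instead of Windows Update:
dism /online /enable-feature /featurename:NetFx3 /all /source:D:\sources\sxs /limitaccess

# Alternatively, via PowerShell's Server Manager module:
# Install-WindowsFeature NET-Framework-Core -Source D:\sources\sxs
```

If you copy D:\sources\sxs to a local folder first, the /source argument can point there instead, and the media is no longer needed.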

Monday, July 22, 2013

Windows InTune and User Management

The addresses for Windows InTune and the user management online applications are not similar or on the same domain root, but they are important to know if you wish to create users that InTune will then assign to devices.  Unfortunately, they are not obvious or easy to find.  So, to remind myself where they are when I go looking for them again in a few months, here they are.

If you wish to log into Windows InTune administration, you need to go to http://manage.microsoft.com.

If you wish to log into InTune user administration, you need to go to http://account.manage.microsoft.com.

If you wish to manage your profile, you need to go to http://portal.microsoftonline.com/.

If you wish to view the company portal, you need to go to http://portal.manage.microsoft.com.

Not confusing at all…

Tuesday, June 11, 2013

Zip in .NET 4.5

I was trying to zip a file today in .NET 4.5 and, after looking around on the internet for a bit, realized a lot of the code out there was written for the beta of the 4.5 framework, not the final version.  There were some API changes between the two, so the process changed a bit.  Here is a sample of how to zip a file in .NET 4.5:
// Requires a reference to System.IO.Compression.
// create a zip file
var zipFilePath = "c:\\myfile.zip";

using (var zipFile = new FileStream(zipFilePath, FileMode.CreateNew))
using (var archive = new ZipArchive(zipFile, ZipArchiveMode.Create, leaveOpen: false))
{
    // create an entry in the archive for the file we want to compress
    var entry = archive.CreateEntry("log.txt", CompressionLevel.Optimal);

    using (var writer = new BinaryWriter(entry.Open()))
    {
        // write the source file's bytes into the archive entry
        writer.Write(File.ReadAllBytes("c:\\log.txt"));
    }
}

Tuesday, May 14, 2013

The backups, the backups, oh the… wait, I don’t have them… DOH!

I know I should back up everything regularly.  I work in software.  Really, I have no excuse.  I’ve read several of Scott Hanselman’s posts on setting up regular backups.  I have some (manual) backups that I do on occasion, but nothing regular and certainly not comprehensive.  A few weeks ago, the blog hosting service canceled my account; they did not keep any backups (why would they?), and I did not have any backups of my blog posts.  Shame on me!  And so here I am, attempting to recover some of the posts through Google’s cache and the Internet Archive, with limited success.  The images are gone, but at least the text content is available.

The main lesson I have learned through this is that backups must be automated if they are going to happen regularly.  A manual backup is nice, but I forget, I get busy, I tell myself I’ll do it tomorrow, or make any number of other excuses.  The tools exist today for automating most backups.  I have no excuse.  Neither do you.

I echo Scott’s recommendation of having at least three backup locations.  I’ve spent the last several days setting these up: cloud storage as one, a local NAS (network attached storage) drive as the second, and an external hard drive as the third.  I’m still in the process of finishing this.  I will also be setting up services that run daily or weekly to back up my mail, blog posts, and any content that regularly changes.  By the end of this, I’ll likely be backup crazy, but next time I won’t lose content!