Contains notes and lessons in working technology, especially .NET, Azure, DevOps, Agile, and Team Foundation Server.
Friday, November 20, 2015
Getting all changesets associated to work items
// Requires references to the TFS client object model assemblies:
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.VersionControl.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Client;

// Redirect Console.Out to a file for the duration of the run, restoring it afterward.
using (TextWriter tmp = Console.Out)
{
    using (var fs = new FileStream("Test.txt", FileMode.Create))
    {
        using (var sw = new StreamWriter(fs))
        {
            Console.SetOut(sw);
            var workItemIds = new int[] { 1, 2, 3 };
            var collectionUri = new Uri("http://yourtfs/server/");
            try
            {
                using (var tpc = new TfsTeamProjectCollection(collectionUri))
                {
                    var workItemStore = tpc.GetService<WorkItemStore>();
                    var versionControlServer = tpc.GetService<VersionControlServer>();
                    var artifactProvider = versionControlServer.ArtifactProvider;

                    // Read the work items in a single batch.
                    var workItems = workItemStore.Query(workItemIds, "Select [System.Id], [System.Title] from WorkItems");

                    // Each changeset link on a work item is an ExternalLink whose
                    // artifact URI resolves to the changeset itself.
                    var allChangesets = new List<Changeset>();
                    foreach (WorkItem workItem in workItems)
                    {
                        allChangesets.AddRange(
                            workItem.Links.OfType<ExternalLink>()
                                .Select(link => artifactProvider.GetChangeset(new Uri(link.LinkedArtifactUri))));
                    }

                    var orderedChangesets = allChangesets.OrderByDescending(c => c.CreationDate).ToArray();
                    foreach (var changeset in orderedChangesets)
                    {
                        Console.WriteLine("{0} on {1:MM-dd-yyyy HH:mm} by {2} ({3} change(s))",
                            changeset.ChangesetId, changeset.CreationDate, changeset.Owner, changeset.Changes.Length);
                        foreach (var change in changeset.Changes)
                        {
                            Console.WriteLine("    [{0}] {1}", change.ChangeType, change.Item.ServerItem);
                        }
                        Console.WriteLine("-----");
                        Console.WriteLine();
                    }
                }
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.Message);
            }
        }
    }
    Console.SetOut(tmp);
    Console.WriteLine("Press any key to exit...");
    Console.ReadKey();
}
Wednesday, November 11, 2015
Branching
I had a conversation with a coworker yesterday about when to create branches in TFS, and it revealed some confusion surrounding the use of branches. The coworker suggested that a new branch should be created each time code is pushed to any environment, because it’s the only way you can be completely sure the branch isn’t polluted.
To get the obvious out of the way: if a source control system “pollutes” a branch with no human intervention, get a new source control system, because it has failed at its most basic task. It is very likely this is not the case, though. The problem is probably in how you are executing your branching and merging strategy.
Let’s take a typical branching structure: Dev > QA > Prod; Dev is the parent of QA, which is the parent of Prod. Changes are merged from Dev into QA, then from QA into Prod. You should never get merge conflicts when going from Dev to QA, or QA to Prod. Because of this, merging is clean and no “pollution” can happen.
How is this possible?
The only way a merge conflict happens is when a change has occurred in the target branch you are merging into that has not been integrated into the source branch. But, and this is the critical point, if you are following good merging practices, any change made in QA or Prod is immediately merged back into the parent branch(es). No exceptions! If I make a change in the QA branch, my next immediate check-in is a merge from QA to the Dev branch. I resolve any merge conflicts as part of that merge, ensuring that my change is properly integrated into Dev. The next time Dev is merged into QA, it will already contain this change, so no merge conflicts will occur.
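As a sketch of that practice with the tf.exe command line (the server paths and comments here are hypothetical, for illustration only):

```
rem A fix was checked in directly to the QA branch; integrate it back to Dev immediately.
tf merge $/MyProject/QA $/MyProject/Dev /recursive
rem ...resolve any conflicts now, then check in the merge...
tf checkin /comment:"Merge QA fix back to Dev"

rem Later, the normal promotion path encounters no conflicts:
tf merge $/MyProject/Dev $/MyProject/QA /recursive
tf checkin /comment:"Promote Dev to QA"
```

Because the QA-only change was integrated into Dev right away, the later Dev-to-QA merge finds nothing in QA that Dev does not already contain.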
Wednesday, May 14, 2014
PowerShell Copy-Item Recursion
For example, if I have a d:\temp\files directory that I want to copy to d:\temp\files2, I can run:
Copy-Item d:\temp\files d:\temp\files2 -Recurse -Force
However, run that same line again after changing a file or two, and though it appears to succeed, it won’t actually copy the updated files. For that to happen, you must include a * at the end of the source path:
Copy-Item d:\temp\files\* d:\temp\files2 -Recurse -Force
Wednesday, July 24, 2013
Windows Server 2012 and Installation Media
I have to say I am not a fan of this model. Early versions of both client and server kept the components separate, requiring you to have the original media every time you wanted to add a Windows feature. Recent versions solved this by including them, though at the cost of increased size on disk. I’d prefer the components to be copied, to avoid the need for installation media, and this can be done by manually copying the above directory to a location on the disk. Why did Microsoft change this from being automatic?
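For example, with the component payload copied locally (or available on the original media), a feature such as .NET Framework 3.5 can be installed by pointing DISM at that side-by-side source; the drive letter and path here are assumptions:

```
Dism /Online /Enable-Feature /FeatureName:NetFx3 /All /LimitAccess /Source:D:\sources\sxs
```

/LimitAccess tells DISM not to contact Windows Update, so the feature is installed entirely from the specified source.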
Monday, July 22, 2013
Windows Intune and User Management
If you wish to log into Windows Intune administration, you need to go to http://manage.microsoft.com.
If you wish to log into Intune user administration, you need to go to http://account.manage.microsoft.com.
If you wish to manage your profile, you need to go to http://portal.microsoftonline.com/.
If you wish to view the company portal, you need to go to http://portal.manage.microsoft.com.
Not confusing at all…
Tuesday, June 11, 2013
Zip in .NET 4.5
// create a zip file (ZipArchive lives in the System.IO.Compression assembly, new in .NET 4.5)
var zipFilePath = "c:\\myfile.zip";
using (var zipFile = new FileStream(zipFilePath, FileMode.CreateNew))
{
    using (var archive = new ZipArchive(zipFile, ZipArchiveMode.Create, false))
    {
        var dbZip = archive.CreateEntry("log.txt", CompressionLevel.Optimal);
        using (var writer = new BinaryWriter(dbZip.Open()))
        {
            // here is where we read the file into the zip
            writer.Write(File.ReadAllBytes("c:\\log.txt"));
        }
    }
}
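Reading an archive back is symmetric; a minimal sketch using ZipArchiveMode.Read (the file path mirrors the hypothetical one above):

```csharp
using System;
using System.IO;
using System.IO.Compression;

using (var zipFile = new FileStream("c:\\myfile.zip", FileMode.Open))
using (var archive = new ZipArchive(zipFile, ZipArchiveMode.Read))
{
    // Entries lists everything in the archive without extracting it.
    foreach (var entry in archive.Entries)
    {
        Console.WriteLine("{0} ({1} bytes)", entry.FullName, entry.Length);
        using (var reader = new StreamReader(entry.Open()))
        {
            var contents = reader.ReadToEnd(); // process the entry's contents
        }
    }
}
```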
Tuesday, May 14, 2013
The backups, the backups, oh the… wait, I don’t have them… DOH!
The main lesson I have learned through this is that backups must be automated if they are going to happen regularly. A manual backup is nice, but I forget, I get busy, I tell myself I’ll do it tomorrow, or make any number of other excuses. The tools exist today for automating most backups. I have no excuse. Neither do you.
I echo Scott’s recommendation of having at least three backup locations. I’ve spent the last several days setting these up, with cloud storage being one, a local NAS (network attached storage) drive being the second, and an external hard drive being a third. I’m still in the process of doing this. I will also be setting up services that run daily or weekly to back up my mail, blog posts, and any content that regularly changes. By the end of this, I’ll likely be backup crazy, but next time, I won’t lose content!