Monday, December 24, 2007

New TFS releases - in time for holidays

If you happen to have some leisure time during the holidays, you might want to check out the following:


While those VPCs rock, one small note - the TFS server in them is completely bare (that is, you should expect to spend some time setting up projects, builds etc. if you are going to use these VPCs for demos).
And they do not have the new TFS Power Tools pre-installed. What, you have not heard about them? Yes, you should download the VS2008 version of the Power Tools. My favorites in this version are file and status search (for all ye SourceSafe faithful - it is very similar to the VSS operation, allowing one to search by file name and file status) and quick label (which allows labeling the selected files/folders in one click).
And that's not all! There is also a new version of the MSSCCI provider for you to download!
Check Brian Harry's blog for more information.

And Happy Holidays to all! See you in 2008!

Thursday, December 13, 2007

Offline in VS 2008 (continued)

In a previous post I talked about offline story for TFS and Visual Studio 2008.

And since then, several important additions have come up.

One important note relates to what exactly becomes "offline" - the solution you are opening or the whole server. It turns out that the default behavior labels the whole TFS server as offline, and as a result, when you open another solution (without first performing "Go Online" on the first one), it will be opened offline automatically. To get the whole server back "online", you will have to perform "Go Online" on one of the solutions.

And that indirectly may cause a bug described in this MSDN forum post - when a solution is opened from Source Control Explorer while the server is offline, an error message will appear (though I believe the flow described is not one that is often used).

The nuances of offline operation are described in Ben Ryan's blog. Additionally, if you are not happy with the whole TFS server being labeled offline when a single solution is taken offline, Ben also describes a way to change that behavior.

I can only add that I wish Ben Ryan would post more often (less "offline" time for the blog :)- it appears that he has lots of interesting version control stories to tell.

Wednesday, December 12, 2007

Get latest on check-out in TFS 2008 (footnote)

One of the new features in TFS 2008 is the ability to get the latest version on check-out (yes, yes, that's the very same feature that has been discussed a zillion times).

The checkbox "Get latest version of item on check out" is located in "Tools->Options" menu, "Source Control->Visual Studio Team Foundation Server" settings tab.

That would be it but for the following fact - you will see the checkbox after you install VS2008 and Team Explorer 2008, and it will be enabled no matter which TFS server you are using (the 2008 client works with both 2005 and 2008 servers). But it will work as advertised only if you have a TFS 2008 server!

So be forewarned - it took me quite some staring at the screen and googling to understand why the latest version did not appear :)

Saturday, December 01, 2007

Choices in conversion of solutions and projects to VS2008

This weekend I am trying to catch up on my blogroll, and I have found several excellent posts on conversion between VS2005 and VS2008. Now that VS2008 is released, converting code bases from the previous version will become a common task, and here is some advice on dealing with it.

Of course, the simplest way would be just converting all solutions to VS2008 (see below for why you do not have to convert projects). Here you have several choices:
- Open VS2005 solution in VS2008 and run the conversion wizard.
- Run VS2008 from the command line, converting the solution in place in the background (that should be perfect if you have more than one solution :). Read more about the magical "/upgrade" command-line switch in John Robbins' blog post.
- Mess around with the solution file in a text editor. The difference between a VS2005 and a VS2008 solution is only the version specifier ("Format Version 9.00" vs. "Format Version 10.00"). The difference in projects is only the MSBuild ToolsVersion attribute (which defines the .NET toolset the project is built with). Since all converted projects use 2.0, that can be easily set. Read more detail about the solution and project format changes in DJ Park's blog post.
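For illustration, here is roughly what the hand-edit amounts to (fragments reproduced from memory - verify the exact header and attribute values against your own converted files before mass-editing anything):

```text
Solution file header, VS2005:
Microsoft Visual Studio Solution File, Format Version 9.00

Solution file header, VS2008:
Microsoft Visual Studio Solution File, Format Version 10.00

Project file root element before conversion (no ToolsVersion):
<Project DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">

Project file root element after conversion (the value shown is what I
recall the wizard writing; check your own files):
<Project ToolsVersion="3.5" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
```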

Now, if you still need to do some development in VS2005 while doing development in VS2008 on the same code base, the best bet would probably be to create a new solution (e.g. "Solution_3.5.sln"). Projects that use .NET 2.0 will build both in VS2005 and in Orcas.

Personally, I will probably use the magnificent "/upgrade" command-line switch to convert all solutions while keeping in mind that I have an option to tweak the files by hand (sometimes it may come in handy if you need to quickly rollback certain solution to VS2005).
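As a sketch of that batch conversion (cmd.exe syntax; the solution names are placeholders, and you will want a backup or a fresh workspace first):

```shell
REM Convert one solution in place, without opening the IDE interactively
devenv.exe Solution1.sln /upgrade

REM Convert every solution under the current folder tree
for /r %f in (*.sln) do devenv.exe "%f" /upgrade
```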

And another conclusion - reading blogs can be very useful and save lots of time :)

Offline and back again in VS2008

One of the most painful issues in VS2005 was its quirky offline support.

One of the first features I have checked in Visual Studio 2008 was working offline. And what can I say - it (mostly) rocks!

Let's do it step-by-step.


  • You open a solution that is under source control and there is no connection to the server. What you get is a dialog indicating the offline condition

    A log message appears in the Output window pane, indicating the detailed reason for going offline

    Important note: Once the solution is opened offline, the next time you open it in Visual Studio the dialog won't appear (but the Output pane will show the same message every time)


  • If you change files in your solution/project while it is offline, you will get a message that the file is read-only once you save the changes. There is no indication of "under source control" status for offline projects or solutions

    Important note: When working offline, you may edit or delete existing files or add new ones, and these types of changes will be supported when going "online" with the TFS server. However, file renames are not supported

  • Now let's go online. You open the solution in VS and you have a connection - but the transition to online will not happen automatically. To sync the solution/project back online you should use the "Go Online" menu, which is available on right-click in Solution Explorer or in the "File"->"Source Control" menu.

    Once you hit "Go Online", the "Go Online" dialog will appear, detailing the changes performed while offline - the files and the types of changes (add/edit/delete). You may choose not to pend the changes for specific files (such a file will remain writable, but will have no pending changes).

    Important note: If you did not perform any changes while offline, a message to that effect will appear
    Once online, the pending changes indications will appear in Solution Explorer.

  • If you want to unbind your solution/project from source control, all you need to do is go to the "Source Control"->"Change Source Control" menu. A message will appear asking whether you want to go online or permanently unbind your solution.
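As an aside, the new Power Tools offer a command-line take on the same transition; a hedged sketch (option names from memory, and the local path is a placeholder - check tfpt online /? on your machine):

```shell
REM Scan the local folder for writable files and pend the matching
REM edits (optionally adds and deletes) against the server
tfpt online /adds /deletes /recursive C:\src\MySolution
```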



But of course, there were several "Things I did not like":

  1. No cancel on Offline dialog - either go offline or ...?

  2. No indication for source controlled files in Solution Explorer in offline mode (similar to that of VSS) - only read-only flag on file serves as an indication

  3. The "Change Source Control" dialog does not allow one to "Disconnect", only to "Unbind"; so going offline happens only when the server is not available


Overall, I should say I liked the VS2008 experience very much (especially compared with the previous version). I can feel that the feedback on the TFS 2005 offline experience was not wasted!

The one and the only non-technical post

As a rule I post only about things technical and TFS related. This post is not going to be technical. Be forewarned!

I just could not pass it up. The guy is absolutely hilarious - every picture is in a different style, and I enormously liked the sense of humour and the diversity of it. So, this being Friday and this blog being my personal blog - I will bust the technical-only rule just this once!

Here it is - The Perry Bible Fellowship. Or maybe this one? A word of warning though - some of the pictures may be a bit risque, so parental advisory is strongly suggested.

Have a nice weekend!

Wednesday, November 21, 2007

Chat with Visual Studio Team System group

Come chat with the Visual Studio Team System group on December 5th!

Join members of the Visual Studio Team System product group to discuss features available in Team Foundation Server, Team Suite, Architecture Edition, Development Edition, Database Edition, and Test Edition. In addition, discuss what's new for these editions for Visual Studio 2008.

There will be two sessions:

Join the chat on Wednesday, December 5th, 2007 from 10:00am - 11:00am Pacific Time. Add to Calendar | Additional Time Zones

OR

Join the chat on Wednesday, December 5th, 2007 from 4:00pm - 5:00pm Pacific Time. Add to Calendar | Additional Time Zones

Saturday, November 17, 2007

MSBuild team wants your feedback!

If you have ever used MSBuild (and if you have Visual Studio 2005, you probably use it daily) and have ideas on how to make it better, the MSBuild team is asking for your opinion. And if you mainly use Team Build, this should be of interest to you too, since MSBuild is the engine used to execute all builds.

Dan Moseley recently wrote a post that lists 11 propositions for future MSBuild features (though they are numbered from 1 to 12 with number ten missing - I guess ten is an unlucky number for MSBuild :). You can spend a virtual $100 to rate the propositions and root for the features dear to you.

I urge you to go over there and vote with your money!

Tuesday, November 06, 2007

Changeset - a unit or set of items?

I came across this post on the forums, and thought it is worth elaborating on a bit.

In a nutshell - when you merge from path X to path Y and specify a certain changeset to merge, will all files in the changeset be merged from X to Y? The answer: no, only files that are both under X and in the changeset will be merged.

From one point of view, it is not very logical, since when you select a changeset you are probably trying to merge all changes in it. On the other hand, you explicitly specify the source and destination paths and expect to merge only items under the path; so if you look at the changeset as a specification of versions to include in the process, rather than as a "container" unit of sorts, the results are logical.

Another changeset usage that I have found puzzling to some users is specifying a changeset for "Get specific version". Again, many people assume that the changeset behaves here as a unit, and expect to get only the files included in the changeset. But the changeset instead behaves as a mere date/time specification, and therefore the versions relevant for that changeset's timestamp will be retrieved for all files under the specified path.
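In tf command-line terms the two behaviors look roughly like this (server paths and the changeset number are made up for the example):

```shell
REM Merge: only the changeset 1234 items that fall under $/Proj/X
REM are merged to $/Proj/Y
tf merge /version:C1234~C1234 $/Proj/X $/Proj/Y /recursive

REM Get: NOT limited to the changeset contents - every file under the
REM path is retrieved at its version as of changeset 1234
tf get $/Proj/X /version:C1234 /recursive
```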

Overall, those nuances are pretty important to be aware of (especially if explained to the users in advance).

Sunday, November 04, 2007

How to extend TFS-related menus in VS

I wrote a while ago about extending TFS-related menus, and as a follow up just wanted to elaborate a bit.

Usually, the problem with adding menus related to TFS is that the parent menu GUIDs/command IDs are not well known. For example, the following post describes a typical problem.

But once the GUID:ID pair is known, adding the menus is a no-brainer. The easiest approach I have found so far is to use the EnableVSIPLogging registry setting. That simple setting may save countless hours of frustration (and I am speaking from experience :).
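For reference, the setting is a DWORD value under the VS registry hive (version 8.0 shown for VS 2005; adjust for other versions). With it set to 1, holding Ctrl+Shift while hovering over a menu or command pops up a box showing its GUID:ID pair:

```text
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\8.0\General]
"EnableVSIPLogging"=dword:00000001
```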

And those IDs are not something that will change within the same version of VS (though for 2008 some testing will probably be required).

Wednesday, October 31, 2007

Move your TFS stuff around - or get ready for that!

Just wanted to share a little bit of information - a pre-release version of the TFS to TFS migration tool has been released on CodePlex. Countless were the days of waiting for this tool, and here it has arrived (albeit in a pre-release version).

I have yet to test it myself, but it appears that one should be able to at least move version control artifacts and work items between different servers.

As it is pre-release, it can probably get better, so I urge you to have a look and let the team know your thoughts.

Further details can be found on Ed Hintz's blog.

Thursday, October 25, 2007

Orcas Beta 2 VPC expires soon!

It turns out that the Orcas VPCs with Beta 2 will expire on November 1st, 2007. So if you have some data inside, get prepared (apparently it is the Windows Server 2003 OS that expires).

See Jeff Beehler's blog post for more info.

Tuesday, October 23, 2007

Automatic shelving, anyone? No, thanks...

As I am catching up with my blog reading (still some thousand posts to browse through in my blog reader), I came across an interesting post by Grant Holliday.

In a nutshell, there is a new project on CodePlex by William Bartholomew, called QuickShelve - a little utility application that may be used to set up automatic shelving on your machine. Initially, I was pretty excited about it - I am a soldier in the "small-free-app-saves-big-bucks" army myself. But then I thought better of it, and became doubtful that auto shelving is (or will be) universally needed, and here is why:


  • You need to set up the app on your local machine, since it only works when the agent is running locally. Ah, I wish it could be set up on the server...

  • When one shelves some changes, it is usually files at a certain stage of development ("wow, it works now", "hell, it ceased to work", "i am afraid to change this" etc). Auto shelving, on the other hand, will shelve at some point in time, and it won't be easy to identify what exactly is inside

  • And lastly - people should check in often. An admin (or team lead) can and should enforce that policy, and maybe even monitor the shelving so that no one is tempted to use shelvesets as substitutes for checked-in changesets. And anything that takes you away from that "daily check-in is a must" rule is the devil's work.



So overall, while auto shelving in its current form might be a good way to guard against force majeure circumstances (hard drive crash etc.), I will wait until it can be set up and administered at the server level. That would be a nice feature (Rosario people, are you listening?).

Wednesday, October 17, 2007

Musings on immigration

I am back after a long hiatus, and this time I'd like to talk about immigration. Just joshing - about migration really (though immigration is a hot topic too :).
I do hope that many shops working with Visual Studio that still do not have TFS in house are considering evaluating/purchasing it. And even if the feature set question is answered satisfactorily, the issue of migration is bound to arise.

Actually, the first question usually is - do we migrate the history and legacy code, retaining the structure and including the artifacts from the bug tracking system, or should we start with a clean-slate, green-field repository?

And at that stage I think some facts and possibilities get overlooked.

History
When talking of the history of all changes performed on a certain codebase over whatever number of years, it appears people tend to hugely exaggerate its importance. Try asking an average developer - how often do you look up history revisions? And how often are those revisions truly historical (not just the one before latest)? Chances are (if the person is not responsible for maintaining legacy code) that the usage is fairly occasional.
Now, let's suppose that the history is not migrated, but that the previous repository remains readily available for whoever needs that data. Would that not be a feasible solution for occasional usage?

Legacy code
Do you want to have all code in the same repository? On impulse, everyone answers yes. But do you really? If there is virtually no maintenance, and putting the code into the new repository means that defect fix procedures, build scripts etc. need to be updated, it might well be time consuming. So would it be better to have all code ever developed in one new repository? Certainly so. But would it be easier to leave maintenance-only code in the old repository? Probably yes.

Bug tracking
With the great number of bug tracking systems available (many of them developed in house), the question of migration may well depend on the person performing it. But even with such a person available, the same question is relevant as with history - are all those closed defects absolutely required? And do we need those 200 fields that hardly anybody ever uses?

So by now I think you catch my drift. While migrating all data from one system to another is the best solution, it is rarely available - and for multiple reasons. In my personal opinion (leaving internal company politics aside), some sacrifices can be made and may even be beneficial to the whole process. Just imagine your source code repository without all the legacy crap lying around - restructured logically, and not a forest of folders that are there due to "historical reasons".

Of course, the situation I describe is simplified a lot. And it is apparent that in a large organization there will be the need to have both history and legacy code transferred to the new system. But instead of trying to transfer everything as is, it might be better to consider an alternative. To me the alternative is this - keeping the old system's repository online and accessible (perhaps read-only) after the new system is introduced, and starting the new repository from the ground up. Or at least consider that for some projects.

And that no-migration/limited-migration scenario may be especially relevant where there is no bridge between the source and target systems. For example, how do you migrate from MKS to Team Foundation?

Overall, I do not advocate the "throw out the old, embrace the new" approach. But I have felt in several cases (where people passionately demanded five years of revision history in the new system without ever having used it in the old one, etc.) that the alternative should at least be examined. And sometimes doing no migration can actually be for the better! Don't you think so?

Monday, August 06, 2007

Final version of TFS Guide is available!

A new version of the TFS Guide is available on CodePlex! And here is the full announcement on J.D. Meier's blog.

One very important part of the new release in my opinion (in addition to the updated contents, of course) is that the TFS Guide is now available not only as a PDF but can also be navigated right to the chapter you need from the CodePlex page.

Time to download your copy!

Monday, July 30, 2007

Root for the migration cause

Have you ever written a post or asked a question about migration paths to TFS from SVN, StarTeam etc.? Do you know someone who did?

Well, it may still not be too late to head to CodePlex and vote for the migration path you have been longing for. And no excuses! If you sum up all the votes and compare them with the number of "How can I migrate [smth] to TFS" questions, the vote count is still short by a couple of hundred votes.

Thanks to Martin Woodward for highlighting the voting in progress. I read about it and then forgot, but now I went there and voted. Did you?

Not only VS 2008 got released

Aw right, VS 2008 Beta 2 was released, as you have probably heard by now. The blogosphere is awash with VS-related posts, but this one is not about that :)

One for the TFS guys - if you do work item customizations, have a look at the custom WorkItem Date Picker control by Paul Hacker. It allows you to have a date picker control in a work item instead of the free-form text input provided by default. And if you find it handy, or find it lacking something, or have a great idea for a custom control, do drop a line to Paul. In my opinion, by now we should have a library of custom controls, verified and ready to deploy, but we still do not - and this one is a great start!

Another for the MSBuild guys - if you find yourself writing a lot of MSBuild in VS, Stuart Preston has created a useful-looking VS MSBuild project template for you. Now you can have an MSBuild project on a par with C# and C++ projects in VS!

And going back to Visual Studio - version 2.0 of Web Access for TFS is available (now called the Team System Web Access Power Tool)!

(I think I had managed to fill this one with exclamation marks!)

Sunday, July 29, 2007

Adding TFS-related commands in VSIP package

If you want to extend the Team Foundation Visual Studio integration, the easiest way would be to write an add-in using VS automation. Alternatively, you may want to develop a full-blown integration package; while that gives you much more flexibility, things also become more complicated.

One of the differences is the way you add commands/menus. In a VS package implementation you would use a CTC (Command Table Compiler) file for that, and you will be required to know the GUID:ID pairs of the parent menus when adding your custom commands. And getting this information might take some doing, especially for Team Foundation related menus!

Obviously, the first thing is to browse through the header files included in the VS 2005 SDK. But the thing is, the GUID:ID information for the Source Control Explorer/Work Items menus is not located with the rest of the similar IDs in the stdidcmd.h/vsshlids.h headers, or in other headers in the Include folder.
But never give up! There are two additional headers: TFS_VC_IDs.h, located under the Program Files\Visual Studio 2005 SDK\2007.02\VisualStudioTeamSystemIntegration\Version Control folder and containing the version control related menu and command IDs, and TFS_WIT_IDs.h, located under the Program Files\Visual Studio 2005 SDK\2007.02\VisualStudioTeamSystemIntegration\Work Item Tracking folder and containing the work item related IDs. Great thanks to Chad Boles@Microsoft for the info!

The interesting thing is that the information is available on the web - but I'll be darned if I could google it (and I am pretty good at that, too)! So here it goes - there is a post on Brian Keller's blog that mentions those headers. Do you think you could have found that?

And in conclusion, a couple more related links. If you have trouble looking up GUID:ID pairs for CTC definitions (which could easily be the case if you are trying to extend custom packages), the technique described in the following Martin Tracy blog post may be very useful. In fact, using that approach I got the IDs right before getting the official answer :)

And if you want to understand CTC better, this here post supplies lots of useful details on how to create bare-bones CTC without headers and pre-compiler.

Sunday, July 15, 2007

.NET Update gets in the way of TFS

If you have installed (automatically through Windows Update or manually) an update for .NET 2.0, you may now encounter issues connecting to TFS, as described in Michael Ruminer's blog post. But no worries, there is an easy way out for you!
The post suggested uninstalling the update, but fortunately, as I started on uninstalling it, I read the comments to the post.
Buck Hodges commented: "If you hit this, simply turn off client-side tracing that you or someone else previously enabled (e.g., in devenv.exe.config), since it's not on by default."
And indeed, editing devenv.exe.config worked like a charm for me. Beats uninstalling things, doesn't it?
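For the record, client-side tracing lives in the <system.diagnostics> section of devenv.exe.config. The switch names below are from my own notes, so treat them as an illustration and match them against whatever was actually enabled in your config; setting the values to 0 (or removing the entries) turns tracing off:

```xml
<configuration>
  <system.diagnostics>
    <switches>
      <!-- 0 disables client-side TFS tracing; non-zero values enable it -->
      <add name="TeamFoundationSoapProxy" value="0" />
      <add name="VersionControl" value="0" />
    </switches>
  </system.diagnostics>
</configuration>
```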
Thanks guys for making the solution available at such short notice!

Friday, July 06, 2007

Maximum size of the file under source control (continued)

In my previous post I wrote about the delta mechanism used to store file revisions in the TFS database and (somewhat) lamented the lack of configuration and documentation for it.

Well, it seems that at least the configuration part is taken care of. As Richard Berg helpfully pointed out in the comments to the post, it is possible to specify a "deltaMaxFileSize" parameter in the web.config file of the Version Control web service (for a default installation, located in C:\Program Files\Microsoft Visual Studio 2005 Team Foundation Server\Web Services\VersionControl).
The value of this key is the maximum file size (in bytes) on which to perform the delta algorithm. For example, the following setting is equivalent to the default (16 MB):

    <add key="deltaMaxFileSize" value="16777216" />

I did play around with that setting a bit; I created a ZIP file full of small 1 KB files and put it through a check-out/check-in cycle, removing a single file from the archive for each new revision. The database size was gauged using SQL Server Management Studio to view the properties of the TfsVersionControl database. The setting did work as expected, and the reverse delta algorithm indeed works amazingly well - with deltas enabled, the database size remained effectively constant as I added new revisions of the file.

So as it turns out, you can configure the algorithm after all; I'd say it is something to consider if you contemplate storing revisions of large binary files. A word of caution though - as the setting has yet to find its way into the official TFS documentation, it is not supported at the moment.

Update: Important remark (courtesy of Buck Hodges) on side effects of changing the default value: "You'll also want to think really hard about setting the value any larger. The library that does this doesn't consume memory linearly (it's CPU intensive as well). It's not hard to run your server out of memory when you least expect it."

Monday, July 02, 2007

Thanks for the award

On Sunday I got an email with Microsoft Visual Studio Team System MVP award inside. Wow! What can I say - I am overjoyed to join the ranks of esteemed and respected Team System MVPs (many of whom I had a pleasure to be acquainted with either in person or virtually). I hope to be worthy of the group and will continue to participate in TFS community (and perhaps now with all additional resources available to MVPs I will be able to give better answers, too!).

Using the occasion, I'd like to congratulate another newly awarded Team System MVP, Steve St. Jean. Many a time I came across his blog while browsing the web for an answer on the intricacies of Team Build. Congratulations, Steve!

Thanks everyone who helped me in getting there!

Thursday, June 28, 2007

Maximum size of the file under source control - should you care?

In this forum thread I came across a very interesting piece of information that I feel is worth sharing with the community at large.

Probably anyone using a source control application has asked this question at some stage - what happens if I check in a 500 MB disk image file? For some source control systems (e.g. VSS) the answer is relatively simple - you'd end up increasing your DB size by approximately 500 MB every time you check in a new revision of the file.
However, in TFS only the deltas between revisions are saved, so your database will not become bloated if you store several versions of that huge binary file. Or so I thought - because it turns out that the delta mechanism is used only for files smaller than 16 MB. Maybe that's just me, but that number eluded me in the original documentation on MSDN, and I was under the impression that maintaining several revisions of a huge binary file was not a big deal.

But in reality, that can well create a problem. In v1.0 of TFS there is no permanent destroy, so if one checks in twenty revisions of a CD image, that will immediately affect your database size - and there is no way back!

The reason for this (conveniently explained by Richard Berg in the forum thread) is understandable - indeed, calculating deltas for large files can adversely affect server performance, and therefore it is disabled. But I am not sure that I like the fact that the limit is not specified in the official documentation and not configurable on the server.

The obvious way of making sure that your database is not affected by those binary files is not to store them in TFS at all (and to create a maximum file size check-in policy to enforce file sizes). Or, if you do store them, store only one revision and use branches if you need to reference the file. While both approaches take some additional effort in policy enforcement and user education, at least now you can be sure you can explain that sudden increase in database size. Look for new revisions of those large files :)

Tuesday, June 12, 2007

TFS ripoff or back up your money (updated)

Today I came across a pretty interesting post on the MSDN forums. To give you some background: recently (about a month ago) some company started to market a suite of products that supposedly perform migration from SVN to TFS, backup of TFS projects, work items etc.

The inability to back up separate projects has long plagued the TFS community; the company was mentioned in several blogs and in several MSDN postings. I was saying to myself, "Kudos to those guys for doing that enormous work and implementing functionality even MS itself was unable to implement". Until today, when somebody (apparently a buyer of the said software) posted the following:

"... is a fraud company. I had purchased one of their products. The buggers provided me a *** software, which was good for nothing. I have even heard from a mate of mine in France that they charged the product amount and never gave the product.

Today - Even their website does not work."


Well, that surely sets off some alarms. So I indulged in a bit of hobbyist investigation, and here is the list of interesting facts:


  1. The company site is offline as of now; the domain was registered on April 19th, 2007, shortly before the first mention of the products offered appeared.

  2. Neither pricing information nor trial downloads of the products were available at the site (the site is only available now through Google cache).

  3. All posts pointing to the company site on the MSDN forums were posted by the same user, "gauravmangla". Looking at the activity of that user, one may see that the user was registered on May 2nd, 2007 and all his postings link to the mentioned company site. No other posts by the same user are found.

  4. The only feedback posts on the MSDN forums are by a user named "john.matthews01", registered on May 26th, 2007, whose three posts are concerned solely with praise for the company's software. No other posts by the same user are found.

  5. No feedback from real customers is found using Google (I mean feedback from a real-life person non-affiliated with the company) - only mentions of the software in TFS related blogs.


Before I compiled that short list, I was inclined to think that the company in question was a new micro-ISV (of the kind regularly discussed on the JoelOnSoftware forums), perhaps with a very decent offering that helps lots of people in the TFS crowd; but now I am of the opinion that some fraud scenario might have been at work. My conclusions from that story - before buying something:

  • Download trial version; if no trial is available then reconsider

  • Request client reference/search for users feedback; if none found then reconsider

  • If the company is new and the price of the software is significant for you, reconsider

  • Use credit card to pay for the software to make sure you get your money back in case of fraud



Update: I stand corrected, as the owner of the company in question has posted an explanation of the situation. The site is unavailable due to some internal problems ("... a disgruntled employee left our company and he was responsible for our web site. He changed all the passwords and removed all the web site code"); no products have shipped so far due to export limitations, so the rumors of products being paid for and not shipped appear to be just rumors without any solid foundation.
I do hope the company will be able to resolve the internal situation and resume business as usual.

Thursday, June 07, 2007

Move caveats (I like to move it move it ... not)

A recent post on the MSDN forums reminded me of an important issue in TFS Move functionality.

To give you a short summary - you move a file (or, more frequently, a folder) within your source code repository using the TFS Move command. The sky is blue, everything works all right - until at some later point you decide to retrieve the item version from before the move. Then the VS GUI will give you an error; nothing you do will get you that version using the UI. The only possible workaround is to use the tf command-line get with the old item name/path and a versionspec from before the move.
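
The workaround looks roughly like this (the server path and changeset number are hypothetical; specify the item's pre-move path and a versionspec from before the move):
tf get $/project/oldfolder/file1.cs /version:C100
Note that the old server path is specified, not the current one.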

While for most of us it is hardly a show stopper, think about a scenario where you move a whole team project subtree (that's what the author of the original forum post did). If you are interested in previous revisions in that setup - that might be pretty cumbersome.

I do not judge the implementation; in my opinion you can argue successfully both ways (bug vs. logical implementation). But it is certainly a fact to be aware of before you check in the results of your latest move operation.

Wednesday, May 16, 2007

TFS local cache: servers, workspaces and more!

An interesting part of the TFS object model deals with locally cached information. Using that API you may iterate over local workspaces, determine whether a specific path is mapped and even find out the TFS server location URI. I touched upon it in a previous post, but recently a related question was raised on the forums again; so it appears that the topic is ripe for some additional clarification.

The journey into cached information starts with the Workstation class (that and other relevant classes are part of the Microsoft.TeamFoundation.VersionControl.Client assembly). This singleton class exposes a static Current property which returns the Workstation object for the current computer. The class exposes several methods of interest; for example, the IsMapped method allows you to determine if a specific path is mapped in some workspace.

In my opinion, the most valuable method is GetAllLocalWorkspaceInfo. As follows from its name, it returns the set of WorkspaceInfo objects that comprise the local workspace cache.
Using the WorkspaceInfo class you may perform several useful tasks:


  • Obtain the TFS server URL by using the ServerUri property; that may come in handy if you do not have the server URI and do not want to force the user to specify it

  • Retrieve a Workspace object using the GetWorkspace(WorkspaceInfo workspace) method of VersionControlServer or the GetWorkspace(TeamFoundationServer) method of the WorkspaceInfo object itself; from that point on it will be possible to perform the usual version control operations

  • Use WorkspaceInfo properties to view local workspace information without connecting to the TFS server


While usually the location of the TFS server is known and one may use the QueryWorkspaces method to get the list of workspaces, the approach described above will help you out in cases where one needs to find out TFS server locations or obtain information about local workspaces without setting up a connection to the server.

And to conclude, another convenient way to retrieve the list of server URIs is to use the GetServers method of the RegisteredServers class. It will return the list of all server locations stored in the registry (the list that appears in the "Connect to TFS Server" window).
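
To put the pieces above together, here is a minimal sketch (against the VS2005-era object model; the local path is hypothetical and error handling is omitted):

  using System;
  using Microsoft.TeamFoundation.Client;
  using Microsoft.TeamFoundation.VersionControl.Client;

  // Enumerate the local workspace cache - no server connection required
  foreach (WorkspaceInfo info in Workstation.Current.GetAllLocalWorkspaceInfo())
  {
      Console.WriteLine("Workspace '{0}' on server {1}", info.Name, info.ServerUri);
  }

  // Check whether a local path is mapped in some cached workspace
  bool mapped = Workstation.Current.IsMapped(@"C:\src\MyProject");

  // Retrieve a full Workspace object (this call does connect to the server)
  WorkspaceInfo[] infos = Workstation.Current.GetAllLocalWorkspaceInfo();
  if (infos.Length > 0)
  {
      TeamFoundationServer server = TeamFoundationServerFactory.GetServer(infos[0].ServerUri.AbsoluteUri);
      Workspace workspace = infos[0].GetWorkspace(server);
  }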

Tuesday, April 24, 2007

Word on TFS Migration and Synchronization Toolkit

As has been widely blogged, the TFS Migration Toolkit pre-release version became available on April 20th.

I promptly downloaded it to have a quick look - after all, migration scenarios are a fairly common occurrence in TFS implementations, and so far there was not much one could do. But I'd say that the toolkit fell somewhat short of my expectations, and here is why:


  • It appears to be geared almost entirely to third-party developers; if I am going to write a commercial solution for Quality Center/TFS sync, it certainly makes sense to try and use it. But I fail to see how the average customer can benefit from it, or myself as the person implementing TFS - the complexity of writing a migration utility appears to be significant

  • The example of a WSS converter is not really helpful - of course that's my opinion only. I'd say most people are looking to migrate other version control/bug tracking systems to TFS, or to sync several TFS boxes, rather than WSS solutions. So from my perspective, building an example for WSS that includes all the quirks of WSS is not the best candidate for the task. One wants the example to be relatively simple, and including WSS does not promote that cause. What about demonstrating the power of the toolkit on VSS, or ClearCase, or BugZilla, or Quality Center?

  • I am a little unsure of the DB usage in the toolkit. So I have to use MS SQL Server 2005; but I remember that the license of the MS SQL Server 2005 installed for TFS prohibits its usage by other applications. Would the toolkit be an exception? Or should a license be acquired? The Express edition could be an answer, but certainly not for everyone.


But overall, I am pretty happy about the toolkit being released. At least now there is an alternative to writing things from scratch, and the availability of source code will make things easier to understand (ever tried to tweak VSSConverter?).

Sunday, April 08, 2007

Increasing source control permissions granularity

It is a known limitation of sorts in TFS version control that you cannot set permissions on a specific change type - that is, you can specify the desired permissions on check-in, but not on check-in of "add" or "delete" changes specifically.

An interesting question from the forums dealt with how to implement permissions that allow a folder to support only the addition of new files.

So far the best (or at least the "cleanest") solution for that appears to be the creation of a custom check-in policy that allows check-in of only specific change types. A sample policy was created by Leon Mayne and is available for download here. That policy allows the user to check in only branch/merge related changes.

An extension of that approach would be to allow the user to configure the permitted change types. As the policy sources are not available as of now, you will probably have to contact Leon for that :)
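
For the curious, the core of such a policy is quite small. Below is a minimal sketch of my own (the class name and messages are made up - this is not Leon's implementation, just the standard PolicyBase plumbing with an Evaluate that permits only "add" changes):

  using System;
  using System.Collections.Generic;
  using Microsoft.TeamFoundation.VersionControl.Client;

  [Serializable]
  public class AddOnlyCheckinPolicy : PolicyBase
  {
      public override string Type { get { return "Add-only check-in policy"; } }
      public override string TypeDescription { get { return "Allows check-in of added items only"; } }
      public override string Description { get { return TypeDescription; } }

      // No configuration UI in this sketch
      public override bool Edit(IPolicyEditArgs policyEditArgs) { return true; }

      public override PolicyFailure[] Evaluate()
      {
          List<PolicyFailure> failures = new List<PolicyFailure>();
          foreach (PendingChange change in PendingCheckin.PendingChanges.CheckedPendingChanges)
          {
              // Fail any pending change that does not include the Add change type
              if ((change.ChangeType & ChangeType.Add) != ChangeType.Add)
                  failures.Add(new PolicyFailure("Only additions may be checked in: " + change.ServerItem, this));
          }
          return failures.ToArray();
      }
  }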

Friday, March 23, 2007

Discarding changes in merge

Recently, one merge-related question has come up repeatedly on the MSDN newsgroups, and I thought it worth dedicating a post to the issue of discarding merges.

The typical scenario would be this - you branched the code, changed files, and upon merge you do not want to merge the actual changes. But you do want to make sure that the changeset in question will not appear in future merges.

The best way to do that is to use the discard option of the merge command. The option is available only through the command-line client (tf.exe) and basically performs a merge without taking any changes from the source to the target; its only purpose is to update the merge history between source and target and thus prevent the discarded changeset from appearing in the future.

A few tips along the way on using the discard command switch:
1. If you are trying to discard a specific changeset, it needs to be specified twice, as the command requires that a range of changesets be specified (the example discards changes in changeset 666 made in file1.cs from branch to trunk):
tf merge $/project/branch/file1.cs $/project/trunk/file1.cs /discard /version:C666~C666

2. If you are merging folders, do not forget the recursive switch (the example discards all changes in changeset 666 made in the branch folder from branch to trunk):
tf merge $/project/branch $/project/trunk /discard /recursive /version:C666~C666

Overall, this command may come in real handy at times; it is a pity that it is only available through the command line.

In conclusion, great thanks to Richard Berg for clarifying some of the above and for his educational efforts on MSDN forums.

Update: Judging by the forum questions, it appears necessary to mention that after performing the merge you are still required to check in the changes; though there is essentially no change in file contents, the TFS paradigm requires a check-in of every change. And if you think of it, the whole flow is no different from any other merge.

Wednesday, February 28, 2007

How work item permissions affect query results

Today I came across an interesting article (well, it is only a paragraph's worth of text) - in MSDN of all places. Reading that short one in time could have saved me several hours :)

The gist of the thing is this - if you have set permissions on area paths (deny "View work items in this node"), then you may be in for a surprise, as that permission will affect work item query results. If you think about it, it is only logical that a person who has no permissions does not see the forbidden work items.

But if you use those permissions, you should explain that aspect of the behavior to every user, as it may lead to rather unpleasant conversations ("I saw only five bugs in my report and now you tell me there are fifty!"). That makes me wonder - would an indication of restricted work items in reports (perhaps similar to the indication of restricted branches in item properties in version control) make the situation better?

At any rate - if you so much as think about changing the area path "view work items" permissions, read the article first and think again! As the MSDN small print says, "... the system does not warn the user the query results are incomplete ..."

Exam 71-510: I still do not care about command line parameters

Yesterday I took the beta of exam 71-510, and man, I was in for a disappointment!

Usually I like taking beta exams, as they are much more challenging than the regular ones (and free :). The last betas I took were 71-551 and 71-552; those exams were not easy but quite interesting, and the answers I was not sure of pointed me to some important things to read up on.

And here comes the TFS exam - what a messy affair! I do realize the difference between a C# WinForms exam and one dedicated to a product, but still - I think the TFS exam is mostly useless as far as certifying knowledge of Team Foundation Server goes.

Here are my main pain points (I would like to be more specific, but the legalese you agree to at the beginning of the exam would probably get me sued):


  • There are questions that test knowledge of the parameters of command-line utilities that most users run once in a product's lifetime. And what's more - I have used them more than once but do not remember a single parameter. And why should I?

  • There are way too many cases where the operation performed is illustrated using the command line (even in cases where there is a valid GUI alternative, and the command line is almost never used). Again, why exactly would I remember command-line parameters if I never use them?

  • Some command-line utilities are really obscure. Yes, I know that they exist and what their purpose is, and I have even used them, but would that be the ordinary case?

  • Some components of TFS get an unfair share of the exam questions. I'd say that most people use Version Control, Work Item Tracking and Team Build (with some Reporting thrown in). I have yet to see an organization (successfully) using the integration with MS Project, or one modifying the project guidelines. Or should custom controls be used extensively in Work Items? If you judge by the exam questions, all those things are mandatory parts of any TFS installation

  • And the last one - some questions were really verbose, to the point of actually obscuring the question (and in other cases the answers). Why do we have this kind of GMAT approach in a technology certification? What is the purpose of artificially shadowing the meaning?


The overall impression was that the questions were just thrown together, and I did not see a clear picture of what the exam was supposed to certify. The questions on WIT and Build were good, but the rest ... - well, I said it all above.

I do not plan on taking it once it goes live as a regular exam, but I sure hope that Microsoft will change the exam before then (and some 30% of the questions may easily be wiped out - there is no improving them).

And yeah, to back up my rant somewhat - that was my 9th MS certification exam.

UPDATE: As of 16-Apr-2007, I have passed the exam and now bear the proud title of "Microsoft Certified Technology Specialist: Microsoft Team Foundation Server, Configuration and Deployment". Hopefully, that makes my point of view less of a rant and more of a constructive suggestion :)

Thursday, February 15, 2007

How to check if file is already in TFS repository?

Recently I came across a development issue with the version control object model. In developing a VS TFS add-in, it was essential to check whether some file is already controlled (that is, added to the source control repository).

It is tempting to use the TryGetServerItemForLocalItem method of the Workspace class - if the item does not exist on the server, it will return an empty string. But nothing in life is that simple - all of GetServerItem/TryGetServerItem (as well as GetLocalItemForServerItem and TryGetLocalItemForServerItem) do not test items for existence in the repository; in fact, they are very thin wrappers over simple parsing methods.

Then naturally there must be something in the VersionControlServer class, right? When I started looking for that method, I came across some pretty weird solutions (such as using the GetItem method and treating a thrown exception as meaning the item does not exist).
But in fact there is a special method that does exactly that - the best solution for finding out if an item exists on the server. It is (not too) aptly named ServerItemExists, and it even has two overloads: the first is a simple check whether an item exists by path and item type, while the second additionally allows one to supply the item version and deleted status.
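
A minimal sketch of both overloads (the server URL and paths are hypothetical):

  using Microsoft.TeamFoundation.Client;
  using Microsoft.TeamFoundation.VersionControl.Client;

  TeamFoundationServer tfs = TeamFoundationServerFactory.GetServer("http://tfsserver:8080");
  VersionControlServer vcs = (VersionControlServer)tfs.GetService(typeof(VersionControlServer));

  // First overload - check by server path and item type
  bool exists = vcs.ServerItemExists("$/project/folder/file1.cs", ItemType.File);

  // Second overload - additionally specify version and deleted state
  bool existsNonDeleted = vcs.ServerItemExists(
      "$/project/folder/file1.cs",
      VersionSpec.Latest,
      DeletedState.NonDeleted,
      ItemType.File);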

So be advised and use the appropriate method! And thumbs up to Microsoft for the good design of the object model!