We’ve been working with git as a distributed version control system (DVCS) for Fedora Documentation this release. All of these documents (the Installation Guide, the Release Notes, and various README files) are authored in DocBook XML, so they work great within a VCS.
Sure, it’s cool to work entirely offline, do granular commits, and merge cleanly with the codebase when I reconnect later. That’s all good. What has been a real gift from the gods over the last 48 hours is how fast it is, especially for operations over the network: I see very little bandwidth used, and they finish quickly. No more running ‘cvs ci’ and waiting.
Another great aspect is how a DVCS such as git supports a distributed team working in real and non-real time. Unlike with an older centralized VCS, it is much easier to work divergently and still have everything merge together. For example, after making a number of changes, I do granular commits, which exist only in my local repo (clone). I can do a commit per file, so each change has a specific changelog entry tied to it, and it is easier to undo or manipulate from that commit, which has its own SHA-1 identifier. Then I only have to run ‘git pull --rebase’, which pulls down any changes pushed by other collaborators, updates my local repo, and replays my changes on top of this new copy of HEAD. Then I can ‘git push’ without any conflicts arising.
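That whole flow can be sketched end-to-end in a throwaway simulation. To be clear, everything below is invented for illustration (the bare “central” repo, both clone paths, the file names, and the commit messages), not the real Fedora Docs setup:

```shell
# Simulate the workflow: a shared bare repo, a collaborator's clone, and mine.
# All paths, file names, and messages here are hypothetical.
set -e
work=$(mktemp -d)
git init -q --bare "$work/central.git"

# A collaborator clones and pushes the first commit
git clone -q "$work/central.git" "$work/theirs"
git -C "$work/theirs" config user.email "them@example.com"
git -C "$work/theirs" config user.name "Them"
echo "release notes" > "$work/theirs/Release-Notes.xml"
git -C "$work/theirs" add Release-Notes.xml
git -C "$work/theirs" commit -q -m "Draft release notes"
git -C "$work/theirs" push -q origin HEAD

# I clone, then the collaborator pushes another change while I work
git clone -q "$work/central.git" "$work/mine"
git -C "$work/mine" config user.email "me@example.com"
git -C "$work/mine" config user.name "Me"
echo "more notes" >> "$work/theirs/Release-Notes.xml"
git -C "$work/theirs" commit -q -am "Expand release notes"
git -C "$work/theirs" push -q origin HEAD

# My granular, per-file commit lands only in my local clone
echo "install steps" > "$work/mine/Installing.xml"
git -C "$work/mine" add Installing.xml
git -C "$work/mine" commit -q -m "Clarify partitioning steps"

# Pull down their pushed commit and replay mine on top of the new HEAD...
git -C "$work/mine" pull -q --rebase
# ...so the push is a plain fast-forward with no conflicts
git -C "$work/mine" push -q origin HEAD
```

The rebase is what keeps history linear here: instead of a merge commit, my local commit is rewritten on top of whatever landed in the central repo while I was working.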
In a centralized VCS, if you find yourself out of sync with the latest HEAD, you either have to sync and deal with potentially spurious merge errors, or do a manual version of what git does. How many times have you copied your changed files out of a Subversion or CVS working directory, reverted back to HEAD from the central repository, then manually reapplied your changes?
If anyone is making them … I want a t-shirt that says “praise git”.