Saturday, December 15, 2007

ORM Hunting (in the .Net world)

I have been hunting around for some code generation tools to play with. I'm really looking for something very simple, like the Ruby on Rails generator. I've come to the conclusion that no one wants anything easy in the .Net world.

I've also been looking at some DALs, ORMs, etc. Really, most of the DALs and ORMs offered in the mainstream are just WAY too much. I really have to think that if you need some of the more advanced features of these systems you may want to rethink your design. Heck, you may want to read up on doing "The Simplest Thing that Could Possibly Work."

Anyway, I use Castle's great framework called ActiveRecord. It's so simple that you really don't need code generation. As long as you keep things simple when you begin using it, you can slowly move towards the advanced stuff (if you really need it). So what was I REALLY hunting for? I'm really looking for a database schema upgrade (or migration) tool that's as simple as ActiveRecord.

ActiveRecord has a nifty CreateSchema method. This method scans all the ActiveRecord classes in your model and creates a database schema for persisting your objects. Of course, since ActiveRecord is also an ORM, it provides you nifty methods for instantiating and persisting objects (to the newly created schema). The real sticking point is this: CreateSchema deletes the database every time! This means all of your data is lost!

For a while we've used a very kewl in-house migration library. Whenever you modify or extend your database schema you just add another migration. It versions each migration and the database, allowing for a very painless upgrade every time the software is run. Before our migrations tool, we kept large repositories of bloated DDL/SQL scripts...yuck.
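Our in-house library isn't public, but the core idea is small enough to sketch. Here it is in plain Ruby (the class names and the hash standing in for a "database" are invented for illustration; a real runner would execute DDL and record the schema version in a table):

```ruby
# Minimal sketch of a versioned migration runner. Each migration
# carries a version number; the runner applies only those newer
# than the version the "database" last recorded.
class Migrator
  Migration = Struct.new(:version, :up)

  def initialize
    @migrations = []
  end

  def add(version, &up)
    @migrations << Migration.new(version, up)
  end

  # Apply every migration newer than the db's recorded version,
  # then bump the recorded version. Re-running is a painless no-op.
  def migrate(db)
    db[:schema_version] ||= 0
    @migrations.sort_by(&:version).each do |m|
      next if m.version <= db[:schema_version]
      m.up.call(db)
      db[:schema_version] = m.version
    end
  end
end

db = { tables: [] }
migrator = Migrator.new
migrator.add(1) { |d| d[:tables] << :posts }
migrator.add(2) { |d| d[:tables] << :comments }

migrator.migrate(db)   # applies 1 and 2
migrator.migrate(db)   # no-op: already at version 2
```

The upgrade stays painless precisely because the version check makes every run idempotent.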

Now, back to my hunt. It turns out that I was really looking for a tool that would look at my model and generate (or upgrade) the database schema based on the model. ActiveRecord already does the generating....so what I need is an upgrade option.

I've heard that Hibernate has such a feature...so far nHibernate (ActiveRecord wraps nHibernate) does not have this support. Maybe someday. Maybe I'll write something. What boggles me is the number of generators that focus on generating the model based on an existing database schema. Now, I do understand that many databases with lots of info are out in the field...and generating a model based on that schema might make sense. The ones that seem silly to me are the ones that generate a new set of model classes every time the schema changes...instamagically!

It's a challenging thing to create these sorts of frameworks, I guess. I'm not sure I'm comfortable with a framework that generates and regenerates the model in such a willy-nilly way. Wouldn't this be a refactoring nightmare? Sure, you don't want to constantly refactor your database schema either. Other applications may depend on that schema. But, by generating the schema from the model, your schema is less likely to change often or drastically.

Here are some of the ORMs and DALs I found:
  • Castle ActiveRecord (.Net)
    Castle's port of the ActiveRecord pattern wraps the nHibernate library. ActiveRecord makes it very simple to create a model and persist it to a database. You don't need any special tools. You create a simple XML configuration (connection strings, etc.). To make things really simple, your classes inherit from ActiveRecordBase. This base class provides all the methods you need for creating, updating, and searching for business objects.
  • Hibernate (Java)
    I haven't used hibernate yet.
  • nHibernate (.Net)
    I've used nHibernate briefly. Unlike ActiveRecord you have to create complex XML mappings between your model and the database. I'd recommend going with ActiveRecord.
  • ActiveRecord (Ruby)
  • doodads (.Net)
Kroon had a nice article on several of these. Also check the Server Side's article on ORMs. It has a pretty complete list.
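If you're wondering what all these libraries have in common, the ActiveRecord pattern itself fits in a few lines. Here's a plain-Ruby toy version (the in-memory hash stands in for a database table; Castle's real implementation persists through nHibernate, and Rails' through SQL):

```ruby
# Bare-bones ActiveRecord pattern: the model class itself knows
# how to save and find instances. Storage is a hash standing in
# for a real database table.
class RecordBase
  @@store = Hash.new { |h, k| h[k] = {} }
  @@next_id = 0

  attr_accessor :id

  def save
    @id ||= (@@next_id += 1)          # assign a primary key on first save
    @@store[self.class.name][@id] = self
    self
  end

  def self.find(id)
    @@store[name][id]
  end
end

# A model is just a subclass -- no mapping files, no special tools.
class Post < RecordBase
  attr_accessor :title
end

post = Post.new
post.title = "ORM Hunting"
post.save

Post.find(post.id).title   # => "ORM Hunting"
```

The appeal is exactly what the list above describes: the base class carries the persistence plumbing, so the model stays simple.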

Planning Poker on Rails

I'm starting a new Open Source project. It will be a Planning Poker game for distributed teams. Here's the blurb on the project site:

The planning poker on rails website will allow distributed teams to play planning poker. It will track user stories and the points assigned to each story. It should also allow for quick access to archived stories as references for future planning games.

Each player should be allowed to place a card (estimated effort). Once the cards are all placed, the application will reveal the estimates to everyone. If no consensus is reached, then each player will have the floor to make comments. As each player makes his or her comments, the application will move to the next player. Once all players have no more comments OR a player presses the VOTE NOW button, the application will take another vote. This continues until consensus is reached OR all players agree to table the vote.
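For what it's worth, the consensus rule in the blurb is easy to pin down in code. A plain-Ruby sketch (the method name and the "all cards match" rule are my reading of the blurb, not necessarily the project's final logic):

```ruby
# A round reaches consensus when every player has placed a card
# and all the cards match.
def consensus?(votes)
  return false if votes.empty? || votes.any?(&:nil?)
  votes.uniq.size == 1
end

consensus?([3, 3, 3])     # => true
consensus?([3, 5, 3])     # => false (discussion round, then revote)
consensus?([3, nil, 3])   # => false (still waiting on a card)
```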


Before you google it...YES another project already does this. So, you may ask, "why re-invent something?" Well, here are a few reasons:
  1. Of course I think I can do it better!
  2. I don't think the other is open source....just free to use.
  3. I really wanted to play with google's code projects.
  4. I wanted a project for which I would be my own user.
  5. Did I say I think I can do it better?
  6. I wanted something that begged for Ajax and Prototype stuff.
  7. I wanted a project that had users interacting in realtime over the web.
  8. I really like using Planning Poker on my teams and I wanted something that was designed around "MY" rules for the game.
  9. Just want to play with a few technologies.
Wanna check it out? Go Here!

Wanna help? Email me!

Wanna check out the competition? Go Here...

Sunday, November 25, 2007

Bashing Agile / XP

I just read an article that takes a very cynical view on XP and Agile methodologies. I wasn't allowed to comment without signing up for something so I will comment here.

Check out the post "On the trail with the Cowboy Coders."

I think the author misses the point of Test Driven Development (TDD) and the XP practices, as a whole. His suggestion that XP is "Cowboy Coding" couldn't be further from the truth. In fact (this is always said) XP takes a great deal of discipline. All of the practices and values must work together or you will never reap the benefits of XP. In particular, TDD is not merely a way of making sure that your code is tested. This is a common misconception amongst those who have been newly introduced to TDD.

The reality is that TDD serves as a design tool. Now, XP does "shun" up-front formal design. But XP does not ask us to do "no design". The design is expressed differently....the design evolves. Let's take a look at how TDD helps our designs evolve. First, users write Acceptance Tests. Properly written acceptance tests are the XP team's first expression of a system's design. These tests are used as a road map for what the user should experience when interacting with the new system. Second, we have unit tests. Unit tests stress those portions of the system that are invisible to the end user. The scope of Unit Tests should be determined by the team...for instance, we tend to write a test fixture for each discrete component of a system. Finally, we write tests BEFORE we write code. Once we have a failing test we begin writing code until the test passes. It sounds strange, until you realize that the tests are more than tests...they are the design document for our new system.
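To make "tests BEFORE code" concrete, here's the rhythm in miniature using Ruby's bundled minitest (the leap-year example is mine, not from the article): the test is written first, fails because `Leap` doesn't exist yet, and only then do we write just enough code to pass.

```ruby
require "minitest/autorun"

# Step 1 (red): write the test first. Run this before Leap
# exists and it fails -- that failure is the design telling
# us exactly what to build next.
class LeapYearTest < Minitest::Test
  def test_century_years_are_leap_only_when_divisible_by_400
    assert Leap.year?(2000)
    refute Leap.year?(1900)
    assert Leap.year?(1996)
    refute Leap.year?(1997)
  end
end

# Step 2 (green): the least code that makes the test pass.
module Leap
  def self.year?(y)
    (y % 4).zero? && (!(y % 100).zero? || (y % 400).zero?)
  end
end
```

Notice the test doubles as documentation: anyone reading it knows exactly what `Leap.year?` is supposed to do.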

Ask any XP team, it takes a lot of discipline. Most traditional teams save the tests until the end. They see the tests as something that would be "nice to have" but not really necessary. In general they just never get written. This coupled with another reality, that most upfront designs go out the door after a week or two, makes a pretty strong argument for using TDD as a design tool.

Now, before I conclude my little rant, I must remind everyone that XP/TDD does not happen in a vacuum. All the practices need to be applied. Remember "evolutionary design" happens as you go! Tests should be written when you need them, not just because you want a bunch of tests. So please don't run out and write a ton of tests for crap you'll be working on next year. I guarantee that your requirements will change and you will have wasted a lot of time!

Friday, November 23, 2007

What is Important (Warning: Disjointed Thoughts abound)

Well, it's been a while since my last post...trust me, I've been busy!

I've been reading quite a bit about Agile Development trends. Always ready to learn some new nugget of goodness. I always find myself asking, "What is really, really important to an agile project?" Is it simplicity? Test Driven Development? Pair Programming? Evolutionary Design? One of so many other values and practices?

Of course my first response is, "it's all important, XP requires a tempered application of ALL the values and practices."

We have so many around us that don't like this answer. They wonder, "where are the controls? What is the process?" My answer is, inevitably, "the team is the control and the process. They are guided by a coach and a good faith effort to provide regular software releases."

I must admit that, in business, we need to "know". Know when the software is going to be done. Know when the software is not going to be done. Here, I argue that most methodologies fail to effectively predict these things. So, why not select the methodology that wastes less time on fruitless planning and more time on fruitful development?


I am a huge fan of pair programming. Actually, I am a huge fan of many of the more painful practices: TDD, Integrate Often, Simplest Thing That Could Possibly Work, etc. I think they force us to do the right thing. I've also found that most of the practices hinge on pairing. Navigators always seem to be sticklers for doing the right thing.

As usual my thoughts always return to simplicity. I like to think of simplicity as the "golden value". Its importance can never be overstated. Everything that goes wrong in software development can somehow be traced back to some complexity. Simplicity makes the code easier to write. Simple requirements require less time to design. Simple designs require less time to implement. Simple code is easier to integrate and maintain. Easier means faster and faster means a quicker and greater return on investment.

So what makes something simple? Is it simpler to just drop a datagrid on a webform or to use MonoRail views and Brail? What about IoC containers? Is dependency injection complex or simple? I bring up these points for a reason. I read a blog with a pretty half-hearted flame about these same things. I must tell you that I am easily side-tracked by these sorts of discussions. I want things to be simple! Shouldn't WebForms controls be simpler? Or why go through the trouble of implementing an IoC container? I'll try to answer....

I could just say, "these are the patterns we choose to use and it makes the code simpler to maintain and refactor (remember refactor mercilessly)." This works academically, but the real truth is that it "feels" better. The WebForms designer still has a kludgy feel to it. The code it generates is nasty. Also, the MVC framework takes away 90% of the design decisions and, as a framework, is simpler to wrap your mind around. The IoC containers simplify the bulk of the coding...the IoC container is complex...but it's done already! Finally, Dependency Injection really supports "once and only once" and allows you to design objects that do or express ONLY what they are meant to...that's simplicity.
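A tiny example of what dependency injection buys you (plain Ruby, invented class names): the report object expresses only what it does, and the thing it depends on is handed to it rather than hard-wired.

```ruby
# Constructor injection: ReportMailer doesn't know or care which
# mailer it gets, so swapping in a fake for testing (or a different
# transport in production) requires no change to ReportMailer.
class ReportMailer
  def initialize(mailer)
    @mailer = mailer
  end

  def send_report(text)
    @mailer.deliver(to: "team@example.com", body: text)
  end
end

# A fake standing in for a real SMTP-backed mailer.
class FakeMailer
  attr_reader :sent

  def initialize
    @sent = []
  end

  def deliver(msg)
    @sent << msg
  end
end

fake = FakeMailer.new
ReportMailer.new(fake).send_report("velocity: 21")
fake.sent.first[:body]   # => "velocity: 21"
```

An IoC container automates the wiring; the design benefit comes from the injection itself.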

In my next post I hope to expand on some of the "big words" and ideas above. Please comment on what you believe is most important!

Sunday, May 13, 2007

Tactical Scope Creep
In my experience, scope creep is one of the greatest threats to the completion of an iteration. I would even suggest that the most dangerous "scope creep" is the kind inflicted by engineers.

Most discussions of "scope creep" tend to focus on the customer inflicted variety. For agile teams, working iteratively, this sort of strategic creep has been managed (see embracing change). The creep we don't always catch is the creep at the tactical level.

Have you ever been faced with a simple story and somehow ended up making a Rube Goldberg machine? Damned the schedule and designed a complex framework in hopes of using it again later? Regretted wasting time on a complex API that did no more than paint your project into a corner?

In the end, all of these cases are examples of one thing...tactical scope creep. I myself have never fallen prey to these vices (yeah right). Seriously, I can't count the number of times I was going to write the next great framework that would solve ALL future problems......I wish someone had been there to say YAGNI!

So, what do we do? We apply the XP practices and principles.....duh. Remember, XP is a somewhat holistic methodology. You MUST plan iteratively, have users write stories and play the planning game. Ok, disclaimer out of the way, here are a few XP principles/practices that will help you conquer tactical scope creep:
  • Pair Programming
    No sh@t...it works. One navigator, one driver. Two minds are working on the code and the design. This creates a natural 2-way sounding board for new ideas (as opposed to having NO sounding board in normal programming). Bad ideas and transgressions of the other practices are identified and squashed quickly.
  • YAGNI
    You Aren't Gonna Need It! This one is a no-brainer. If the story states "Users can write comments to blog entries", then write a solution that accomplishes this! Notice that this story doesn't suggest that users can view comments. It doesn't suggest that authors can manage comments either. SO...don't worry about it! Suggest that a user write stories for these other features. Writing very small stories is a good practice anyway. Small stories are easier to estimate and easier to focus on. Of course, differentiating between YAGNI and flexibility is no science.
  • TSTTCPW
    The Simplest Thing That Could Possibly Work. This practice dives deeper into the tactical design than YAGNI does. TSTTCPW asks us to simply make a duck a duck. If writing a file to the disk is simpler than writing it to a database....I guess you should write it to the disk. Later if requirements change, and you have properly encapsulated responsibilities, you can refactor to a database. Of course, like most of the XP practices, its application is subject to team arbitration. If a pair can't agree then let the WHOLE TEAM (see below) decide if something really is TSTTCPW.
  • Simple Design
    Over engineering a problem is the most frequent cause of tactical scope creep. If you can get away without a factory/service tier then do so. If you can use an O/R mapper (such as ActiveRecord) then do so. Stay away from glitzy user interfaces....wait until someone writes a story that requires a "glitzy" user interface.
  • Whole Team
    Especially the idea of putting all engineers in one room. Let the team police itself. Inform the other pairs when you are about to make a complex design decision. If your argument sways ALL of them then go for it!
  • Fail Fast
    Ok, this is actually a Test Driven Development (TDD) practice but it will help. Get something up and running quickly. Don't let yourself ponder the intricateness of the problem at hand. Over-engineering almost always starts in the "minds" of smart people.
  • Small Frequent Releases
    Your users will love you AND you will love being loved! That killer new framework would have to be pretty damned good to give you the same validation as a satisfied user. More likely the time you waste designing the killer API will overshadow any benefit derived from it.
Above all, work with your users and testers constantly to improve your understanding of their needs. Also, avoid becoming a simplicity/YAGNI drone. XP is about putting good software before the methodology.
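The "properly encapsulated responsibilities" caveat under TSTTCPW above is the whole trick, so here's a tiny sketch (plain Ruby, invented names): start with the simplest store that could possibly work, and the later refactor to a database touches exactly one class.

```ruby
# The simplest thing: notes go to a flat file. Callers only see
# the store's small interface (append/all), so a DatabaseStore
# can replace this later at the single spot it's constructed.
class FileStore
  def initialize(path)
    @path = path
  end

  def append(line)
    File.open(@path, "a") { |f| f.puts(line) }
  end

  def all
    File.exist?(@path) ? File.readlines(@path).map(&:chomp) : []
  end
end

require "tmpdir"
store = FileStore.new(File.join(Dir.mktmpdir, "notes.txt"))
store.append("a duck is a duck")
store.all   # => ["a duck is a duck"]
```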

Now, some things can slip through the cracks. Most will be unimportant. Some will be VERY important. Here are some things to keep in mind when you are getting ready to drop the YAGNI bomb:
  • Security
    This is the most serious potential problem. Security, especially on the low level, almost never makes its way into a user story. I would suggest putting a security expert on your QA team. Have them write stories describing the security holes that require attention.
  • Accessibility / Multi-cultural / Usability
    Once again these can be handled by placing an expert on your QA team.
Another option may be to track a set of "perpetual stories".

Well, I hope I haven't wasted too much of your time. Please feel free to comment, complain or flame!

Sunday, March 25, 2007


ASP.Net Frustration

It's very hard to articulate "why" I am frustrated with ASP.Net. Recently, I found a post that, while not articulating the "why", does articulate the feeling:
I think what annoys me the most is that I really like C#. It’s a great language, and the .Net framework is awfully well put-together. I really enjoy spending time with C#, but then along comes ASP.Net like the loud, overbearing, half-drunk uncle at a family reunion, tips over our picnic basket, and ruins everything. - Deane
I've tried exploring this subject on many occasions and come to the conclusion that it is a subject best discussed over a pitcher of beer! Before I go on I must give thanks to the guys (and gals?) at the castle project and their incredible rails-style framework, Monorail.

So, ASP.Net is frustrating. To be fair to many who have bought into ASP.Net, ASP.Net sells itself very well. As a technology it is very heavily driven by marketing needs. The MS sales force wanted a technology that "sells"...and they got it. Viewstate, seamless event handling, Components, a response and request pipeline that can be "easily" hooked, and many more features make it a must have technology!

Wow, why the frustration? I think until you've tried something better (see rails above) you just accept the frustration. Heck, it's still better than the procedural approach of ASP or PHP...right?

The easy answer is "developers should know more about the http process and ASP.Net hides too much." That's always sounded a little like a cop-out to me. Don't get me wrong...I agree with the basic premise but I don't think it's the framework's problem. I see a real usability problem with ASP.Net.

ASP.Net wants to hide the ugly details of maintaining the viewstate between the web browser and the server code. The idea is great but in practice it seems to be heavily dependent on the IDE. Webforms seems to give us great options for separating the UI from the logic. In practice though it gets a little hairy.

Where I work, we often put together "Scaffolding" views that are meant to be thrown away or refactored once we've got everything working on the back-end. The Webforms designer in VS.Net allows you to create "Scaffolding" very quickly....but when it's time to refactor, the code is difficult to un-hitch from the IDE.

Another problem with the Webforms approach is how verbose the ASPX code becomes! Unless you do everything in the Designer (which isn't much) you often need to drop into the code. Looking behind an ASP.Net page or into the hidden regions is like opening a Pandora's box of problems!

Finally, why does the page event pipeline need to be SOOOO complicated! When we were using WebForms I would often override the whole thing just to get a little control back! The Framework should make it easier to munge with the request/response pipelines but in many ways Webforms makes it harder.

OK...you've wasted another perfectly good 10 minutes reading my rant. As usual I welcome the wisdom of others in all matters about which I write.

Wednesday, March 07, 2007

Castle Project and Monorail

I just wanted to give a quick nod to the guys at the Castle Project! We've started converting one of our major applications to their Monorail framework.

For those of you unfamiliar with Castle or Monorail, Monorail is an MVC framework for .Net web applications. It completely replaces the webforms framework with a much simpler (and more powerful) rails-style implementation. The project includes an O/R mapper (ActiveRecord) that uses NHibernate as its back-end. Their ActiveRecord implementation greatly reduces the time it takes to map objects to the database. They support several view engines based on NVelocity, Boo (Brail), StringTemplate, and some others. They also have a basic generator for scaffolding, and a basic migrations framework is being developed.
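The appeal of the rails-style approach is easiest to see in miniature. Here's a toy sketch of MVC dispatch in plain Ruby (all names invented; Monorail's real dispatcher does far more, via reflection over controller classes):

```ruby
# A controller is just a class; an action is just a method.
# The "framework" maps /blog/show to BlogController#show and
# renders whatever the action returns -- no viewstate, no
# page event pipeline.
class BlogController
  def show(id)
    "<h1>Post #{id}</h1>"
  end
end

def dispatch(path, params)
  controller, action = path.sub(%r{^/}, "").split("/")
  klass = Object.const_get("#{controller.capitalize}Controller")
  klass.new.public_send(action, *params)
end

dispatch("/blog/show", [42])   # => "<h1>Post 42</h1>"
```

That convention-over-configuration mapping is most of what "simpler and more powerful" means here.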

If you are in a .Net shop and looking to take advantage of Rails/MVC or if you are just fed up with the overengineered webforms model then check out Monorail!

Here is a link to my Castle bookmarks: http://del.icio.us/rss/dpupek/castleproject

Monday, March 05, 2007

Agile Team Evaluations

If you didn't know then I'll tell you now...... I am a huge proponent of XP. I don't much care for Scrum.....it just doesn't work for me, deal with it. I recently read an article "Should a ScrumMaster Give Performance Appraisals?" For those of you who are unfamiliar, a ScrumMaster is the same as the XP coach.

My first thought was, "who else is really qualified?" My second thought was, "yeah, who else is qualified?!" Now, I'm not a big "Performance Appraisals" kinda guy. Just seeing the words makes me feel kinda sick. I've rarely written a performance appraisal that I didn't wish I could change later. Even when I was in the Army (yes, I was in the Army for 12 years) we prepared many "Performance Appraisals" called Evaluation Reports. If it was a good soldier we scoured the last year's calendar for good stuff he had done. If it was a bad soldier we scoured their folder for negative things to say. All said, it wasn't very constructive.

Now, to be fair, we did prepare many accurate appraisals. In the Army it's easy. You have a set number of training events each year with a very specific set of tasks to be evaluated. You perform these same tasks year after year, event after event. They change sometimes but not often enough to really be noticed. You do well on the tasks.....you get a good appraisal.

Software, generally, is different. Every "new" feature is....well...NEW. In other words an unknown. In most cases, new features involve designing something that has never been designed. So, assuming that the unknown is hard to evaluate, appraising the performance of those working on the unknown must be hard as well...right?

Now that I have argued against "Performance Appraisals" I guess this post is finished...right? Wrong. The author, of said article, IMHO, missed the point. First, a "Performance Appraisal" is very difficult to administer, especially for someone not working closely with the appraisee (is that a word?). Second, software development, especially on agile teams, calls for a completely different approach to appraisals.

We need to accept that an agile team really works as a self organizing organism. No one person can be blamed for the success or failure of a team. That suggests that we really can't evaluate a single person based on the team's performance. So, do we just dispense with "Performance Appraisals"?

Before we answer that we must understand why we have "Performance Appraisals." For most the answer would be, "'cause HR told us to." The ideal answer is "to make the team better", the reality is "to determine who should get a raise" and in the real world both are probably correct. I guess we need "Performance Appraisals"? We'll assume so....

So, how do we design an appraisal that meets both needs (for betterment and promotions)? First, frame it around those competencies that are most valuable to the team. This is where I love XP....values and practices! I'm not going to list them here (it's late) but most intelligent humans could write a set of simple competencies based on the XP practices or the practices of their flavor of agility.

Second, with your handy set of competencies, have each member of your team evaluate themselves and each other member of the team (yes, I know the many reasons this is unpopular). Given the belief that the appraiser should work as closely to the appraisee as possible, what better method to use?

Third, sit down with each of them and discuss (yes face to face like human beings) how they did.
Be creative here; each team, depending on its age and experience, will require varying levels of "discussion". For example you may want to talk to them sitting side-by-side rather than "review board" style. Also, use good judgment. Don't just add up the numbers and say "tom" is 5 points better than "john". The whole point is to help the team be more productive. Basically, try to weed out the pettiness and let the person being appraised drive the discussion.

Finally, write an appraisal on each person, as the coach. Each organization will have some different way of recording them. In most cases you'll have little control over this. Either way, the point is to make the appraisal "meaningful" and give the person real insight. Most of all be fair and if you must write anything negative always recommend ways of improving.

Well, that's my rant today. Remember that this (and most of my diatribe) is only an opinion and likely one that could be changed. So, please, no flaming. But, intelligent arguments are not only accepted but encouraged!

Sunday, March 04, 2007

5 Levels of Planning

I just read an interesting article that breaks down the Agile planning process into 5 levels:
(This is slightly modified)
  1. The Vision
  2. The Roadmap
  3. The Next Release
  4. The Next Iteration
  5. The Next Day
Essentially, each agile planning session occurs at one of these levels, with each level dependent on its predecessor. Ideally, you need a vision before you have a roadmap, a roadmap before you plan the next release...you get the idea.

This really made sense to me. For years now I've used a somewhat complicated model to describe what it is we do. This model, oddly enough, can be easily distilled into this 5 level process.

In the table below I have identified each level of planning with WHAT being the expected artifact and the WHO generally describing the participants:

Level               | What?                | Who?
--------------------|----------------------|--------------
The Vision          | A Vision Statement   | Chickens
The Roadmap         | Roadmap (duh)        | Chickens
The Next Release    | Release Plan/Stories | Chickens/Pigs
The Next Iteration  | Iteration Plan       | Pigs
The Next Day        | Engineering Tasks    | Pigs


In the first level, The Vision, your goal is to provide a vision statement. Each shop will decide its own requirements, but I generally recommend a condensed vision statement that can be easily hung on the wall as an "Information Radiator" (Remember...big visible charts). The Vision should be agreed upon by all stakeholders (chickens) and written without concern for technology. It should also convey some sense of purpose, since it is intended to drive the development process.

Once you have a Vision your team of chickens is ready to start working on The Roadmap. The Roadmap should also be brief enough to work well as a Big Visible Chart. Ideally your developers will keep the roadmap hung up in the shop. As items on the Roadmap are completed they can be checked off. The Roadmap will list, in order of importance, the vision for future releases. I recommend a bulleted list with each release defined by no more than 2 or 3 sentences (1 is better). As with the vision the roadmap should have a sense of purpose and as each item on the roadmap is completed you'll likely want to revisit the vision (it may have changed).

With your Roadmap written you'll want to start planning your Next Release. Planning a release is no simple task. You'll want the input of all your chickens and pigs. I'd recommend, first, relooking/revamping the vision for the next release; something may have changed. During release planning the chickens should be presenting stories to the pigs. These stories fall into 3 categories: enhancements, new features, and bugs. Chickens should only consider enhancements and new features that fit into the vision for this release. Any story considered a bug may be considered, regardless of the vision. Beware what you call a bug....just because you "don't like the way something works" does not make it a bug.

During Release Planning you should play the planning game. Chickens write a story, pigs estimate, chickens rewrite.... The final result should be a set of estimated stories with user values and generally well described acceptance tests. This may take several days. Pigs should feel free to throw back any story that can't be estimated. Chickens should make sure to stay within the vision for this release (or revise it).

Once you've hammered out your Release Plan, you're ready to begin working on the Next Iteration. Your original estimates of a story should not change unless the story is changed. Tracking progress should be done via your velocity (that is outside the scope of this article). At the beginning of each Iteration you'll select a set of stories to complete. This should be a sort of "do or die trying" commitment on the part of the Pigs. If you've tracked your velocity properly then making a commitment should be easy. Put story cards somewhere visible or use a large chart to track progress. No story should be started without an acceptance test and no story is complete until the test passes.
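Velocity tracking is outside the scope of this article, but the arithmetic behind "making a commitment should be easy" is worth a sketch (plain Ruby; averaging the last three iterations is one common convention, not the only one):

```ruby
# If you know your recent velocity, the next iteration's
# commitment is just: take stories, in priority order, until
# their estimates would exceed that velocity.
def commit(stories, velocity)
  total = 0
  stories.take_while { |s| (total += s[:points]) <= velocity }
end

# Velocity as the average points completed over recent iterations.
velocity = [18, 22, 20].sum / 3   # => 20

backlog = [
  { name: "comments",  points: 8 },
  { name: "search",    points: 5 },
  { name: "archiving", points: 5 },
  { name: "reports",   points: 8 },  # doesn't fit this iteration
]
commit(backlog, velocity).map { |s| s[:name] }
# => ["comments", "search", "archiving"]
```

The "do or die trying" part is human, not arithmetic: the pigs commit to the selection, and the tracked velocity keeps the selection honest.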

As each iteration progresses you'll constantly be planning The Next Day's work. Each day engineers/developers should pair up and select a story to work on. Pigs will sometimes hold design sessions with CRCs or write engineering tasks to remind themselves what needs to be done next. A coach should be available to cheer them on as the stories slowly disappear and the new builds start to appear. Always make big visible charts and information radiators to track progress of tests, builds and stories.

As you get into a rhythm your team should start to move through these stages naturally. For many companies distractions are a way of life. Try your hardest to insulate developers from these distractions and you'll see your vision realized much faster. If distractions are unavoidable then just realize that your velocity will suffer. As long as distractions are common your velocity should adequately adjust for them.

Monday, January 01, 2007

Problems with modules2 and CVSNT....
CVSNT has an interesting feature not found in the generic cvs server...."modules2". I've been using this feature for quite some time. It offers some interesting options when compared to the standard modules feature in cvs (and cvsnt). I am not going to cover all of its features since those can be found on the cvsnt web site....suffice it to say that modules2 allows you to define alias modules in cvs using a sort of regex language of expressions.

For a long time I had my guys using modules2 strictly for checkouts done by our build server. We could wrap up all the dependencies in cvs under a single "alias" module. So far one downfall has been that modules2 modules don't support the history command :P Oh well, it still made managing dependencies easy. I've always shied away from letting engineers use modules2 locally because of potential issues with check-ins and creating new modules. Recently some of the engineers (myself included) have started working directly from modules2 sandboxes. It's actually scary. Where do the new modules you create get built? What contracted agreement ensures the behavior will never change? In general most other cvsnt functionality is well documented and has a great deal of precedence to fall back on....modules2 has neither and it may be a mistake to depend on it when it really offers little more than convenience.

We have run into some trouble. One tool we use to integrate cvs into Visual Studio, pushok, tried to create a new module at the root of the modules2 module. Nothing was actually created. The problem is, what should the behavior be? How should these tools behave?

Has anyone else had trouble with modules2? Am I over-reacting? Right now it's my intent to switch our build servers over to using flat module checkouts. This will bring back the history command and speed things up. I am going to discuss the use of modules2 sandboxes with the team and try to achieve some sort of consensus.

Update on 3 Jan 2007
Well, I've gone through and replaced our modules2 definitions with a set of batch files for checking out our various "sandboxes". The batch files look something like this:
REM Check out, into a single sandbox folder, the modules that
REM used to make up one of our modules2 aliases.
SET CVSROOT=:sspi:cvs.foobar.com:/FOO
SET NEWMODULEPATH=FOO.BAR.ALIAS
mkdir %NEWMODULEPATH%
cd %NEWMODULEPATH%
cvs co -P project1
cvs co -P project2
cvs co -P foobar34
cvs co -P common/lib
cvs co -P common/src

So far it's working out pretty well! To check out a set of modules we just run our batch file! More to follow...