# Mind in the Water

adrift in the sea of experience

## Tuesday, May 10, 2011

### Problems encountered when using MEF as a dependency injection container

When we started using MEF as a dependency injection container, I figured the trade-off was roughly this. Good: already part of the .NET 4 framework, and dynamic discovery of components for extensibility. Bad: missing some advanced features like AOP and parameterized construction. Glenn Block posted about this in "Should I use MEF for my general IoC needs?".

Since we didn't need those features, MEF seemed like a good choice. But as it turns out, there is a more subtle problem when using MEF as a general purpose dependency injection container. Consider the following example:

```csharp
public class Program
{
    static void Main(string[] args)
    {
        var container = new CompositionContainer(
            new AssemblyCatalog(Assembly.GetExecutingAssembly()));
        var a = container.GetExportedValue<A>();
    }
}

[Export]
public class A
{
    private readonly B b;

    [ImportingConstructor]
    public A(B b)
    {
        this.b = b;
    }
}

[Export]
public class B
{
    private readonly C c;

    [ImportingConstructor]
    public B(C c)
    {
        this.c = c;
    }
}

public class C
{
}
```


The [Export] attribute is missing on the C class in my example. Since class A indirectly depends on class C, we get an error when we try to get an A instance from the container:

```
System.ComponentModel.Composition.ImportCardinalityMismatchException was unhandled
  Message=No valid exports were found that match the constraint '((exportDefinition.ContractName == "DITest.A") AndAlso (exportDefinition.Metadata.ContainsKey("ExportTypeIdentity") AndAlso "DITest.A".Equals(exportDefinition.Metadata.get_Item("ExportTypeIdentity"))))', invalid exports may have been rejected.
  Source=System.ComponentModel.Composition
  StackTrace: ...
```


So the C export is missing, but surprisingly the error message is complaining about the A class! This is because of "stable composition". Basically, whenever a dependency for a certain part is missing, MEF will resiliently attempt to do the composition without that part. For an example of how that can be useful, see Implementing Optional Exports with MEF Stable Composition.

Useful as it may be, stable composition comes at a price. Since MEF can't tell the difference between critical and non-critical parts, the missing dependency error may cascade upward until you get a mysterious ImportCardinalityMismatchException like the one above.

The MEF documentation on Diagnosing Composition Problems acknowledges this and provides some hints on how to debug such problems. Still, for large compositions the process is far from pleasant. Things are even worse if you have some circular dependencies.

Perhaps it would be better to use Autofac to do the core application composition. It doesn't attempt dynamic discovery or stable composition, so the error messages point straight at the missing dependencies. And with the MEF integration, you can still use MEF for the parts of the application where you do need dynamic discovery and stable composition.
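For comparison, here is a sketch of the same composition done with Autofac's ContainerBuilder API, reusing the A, B and C classes from above (minus the MEF attributes). The exception message is paraphrased from memory, not verbatim:

```csharp
using Autofac;

class Program
{
    static void Main()
    {
        var builder = new ContainerBuilder();
        builder.RegisterType<A>().AsSelf();
        builder.RegisterType<B>().AsSelf();
        // C is deliberately left unregistered.

        var container = builder.Build();

        // Throws a DependencyResolutionException whose message points at the
        // constructor parameter of type C that could not be resolved,
        // instead of complaining about A.
        var a = container.Resolve<A>();
    }
}
```

No stable composition, so the failure surfaces at the actual missing dependency.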

## Saturday, May 7, 2011

### DownMarker, a Markdown editor/viewer

I've released version 0.1 of DownMarker, a small desktop application that can be used to view, navigate and edit a set of interlinked Markdown files.

In case you don't know Markdown, it is an "easy-to-read, easy-to-write plain text format" intended to be converted to HTML. I first encountered it as the format used on Stack Overflow for editing questions and answers. In fact, DownMarker is based on MarkdownSharp, a library generously released as open source by the Stack Overflow team.

Currently some things are still missing, but my hope is that DownMarker will work well for creating lightweight wikis inside your version-controlled projects, or anywhere else you can store files. Let me know if you find it useful!

## Tuesday, March 1, 2011

### MEF attribute-less registration

The Managed Extensibility Framework has shipped in .NET 4, but the work hasn't stopped: the recent MEF2-Preview3 release added some interesting stuff. Let's take a look at attribute-less registration.

First, a recap of the existing attribute-based registration mechanism (or "Attributed Programming Model" in MEF-parlance):

A composition container can take different kinds of export providers. The diagram shows the catalog-based export provider. TypeCatalog is the most important catalog implementation: it is based on a list of types, which it inspects via reflection for MEF attributes. The other implementations are there for convenience, to quickly build type catalogs out of an assembly (AssemblyCatalog), an entire directory (DirectoryCatalog), or other catalogs (AggregateCatalog).

Here is an illustration of the above with a quick code example featuring some dummy types. (The CatalogExportProvider is not visible because the CompositionContainer can conveniently construct it for us if we directly pass it a catalog.)
```csharp
interface IFoo { }
interface IBar { }

[Export(typeof(IFoo))]
class Foo : IFoo
{
    [Import(typeof(IBar))]
    public IBar Bar { get; set; }
}

[Export(typeof(IBar))]
public class Bar : IBar { }

class Program
{
    static void Main(string[] args)
    {
        var catalog = new TypeCatalog(typeof(Foo), typeof(Bar));
        var container = new CompositionContainer(catalog);
        var foo = container.GetExportedValue<IFoo>();
    }
}
```

For an attribute-free alternative to the above, you might expect that the new MEF release would contain an alternative ExportProvider or perhaps a new ComposablePartCatalog implementation, which would take its information from configuration in code instead of reflection. Surprisingly, that's not how it was done!

In reality, attribute-less registration is implemented by injecting information right above the reflection level! We essentially feed fake reflection information into our catalogs to simulate the presence of attributes. The attribute-free version of the above example looks like this:

```csharp
interface IFoo { }
interface IBar { }

class Foo : IFoo
{
    public IBar Bar { get; set; }
}

class Bar : IBar { }

class Program
{
    static void Main(string[] args)
    {
        var registration = new RegistrationBuilder();

        registration.OfType<Foo>()
            .ImportProperty<IBar>(property => property.Name == "Bar")
            .Export<IFoo>();

        registration.OfType<Bar>()
            .Export<IBar>();

        var catalog = new TypeCatalog(
            types: new Type[] { typeof(Foo), typeof(Bar) },
            reflectionContext: registration);
        var container = new CompositionContainer(catalog);
        var foo = container.GetExportedValue<IFoo>();
    }
}
```


Note how the TypeCatalog takes a new ReflectionContext parameter; that's where the reflection info is injected. An interesting result of this approach is that you can leverage the existing catalogs to register types in bulk. For example, the following will export all the types implementing IPlugin in a given directory, without using any attributes:
```csharp
var registration = new RegistrationBuilder();
registration.Implements<IPlugin>().Export<IPlugin>();
var catalog = new DirectoryCatalog("./plugins", "*.dll", registration);
```


This is just what I've gleaned from a quick peek at the unit tests in the new MEF release. There is much more to the RegistrationBuilder API than I have shown here. I'm sure there will be more documentation on the MEF CodePlex site shortly. Also, keep in mind that the APIs in preview releases are subject to change.

UPDATE: The MEF product manager, Hamilton Verissimo (aka "hammet" and "haveriss"), has blogged about MEF's convention model.

## Friday, February 25, 2011

### Executing Visual Studio 2010 unit tests without installing Visual Studio

In a previous post I already explained how to get your unit tests running on a build machine with mstest.exe, without having to install Visual Studio 2008. We recently upgraded to Visual Studio 2010 so I had to repeat the exercise.

This time I wrote a batch script which does most of the work. Just run the script on a machine where Visual Studio 2010 is installed, and it will create a "mstest" folder with the necessary binaries and registry entries. Copy the folder to your build machine, prepare the registry by executing mstest.reg and you're good to run mstest.exe like this:

```
mstest /noisolation /testcontainer:"path\to\testproject.dll"
```

This will return an error code if any of the tests fail.
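In a build script, that exit code can be checked with standard batch error handling. A minimal sketch (the test container path is a placeholder):

```
mstest /noisolation /testcontainer:"path\to\testproject.dll"
if errorlevel 1 (
    echo Unit tests failed
    exit /b 1
)
```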

As an anonymous commenter on my previous post pointed out, you should be careful of the license implications. In our case it is most likely OK because we have a site license. But even so it can be useful to avoid a full Visual Studio install: it saves a lot of disk space, and you don't have to spend time babysitting the installation. Or multiple installations, if you have a cluster of build machines!

Update: thanks to Frederic Torres for pointing out that the script didn't work on 64-bit Windows and suggesting a fix! It should work now. Apparently there are still problems on some 64-bit machines, but I don't have a 64-bit machine without VS2010 to test with, so I'll have to leave it as an exercise for the reader...

## Wednesday, December 1, 2010

### Bisect your source code history to find the problematic revision

I am working on a little application to edit Markdown documents with instant preview. (For those of you who aren't into building things from source, here's a Windows installer that puts a DownMarker shortcut in your start menu.) I like to use both Windows and Linux, so I try to make sure that it runs on both Microsoft .NET and Mono.

When I last tested my application on Mono, I discovered a very annoying problem: each time I tried to type something in a Markdown document, a new Nautilus file browser window was launched! I had absolutely no idea why this would happen or where to start debugging. To make matters worse, I hadn't tested on Mono for about 40 revisions, so there were a lot of changes that might have introduced the problem.

I have the source code history in a Mercurial repository, so I decided to give the "bisect" feature a try. I started by marking the latest revision as "known bad":

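With no revision argument, hg bisect marks the parent of the working directory, i.e. the latest revision I had just tested:

```
hg bisect --bad
```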
I also remembered making some Mono-specific bug fixes, and the problem didn't exist at that point. So I grepped the output of hg log for commit messages containing the word "mono" and marked that revision as good:

```
hg bisect --good 627d6
```

At this point the bisect command has a revision range to work with. It will automatically update your working copy to a revision right in the middle of that range. You just have to rebuild your application, verify whether the problem is still there, and mark the revision as good or bad. (It is also possible to do this test automatically with a script.) The bisect command will then automatically cut the revision range to investigate in half again, and select yet another revision in the middle, etcetera.
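The scripted variant mentioned above hands a test command to bisect, which then drives the whole loop itself. A sketch, where build-and-test.sh is a hypothetical script that exits 0 when the problem is absent and non-zero when it is present:

```
hg bisect --command "./build-and-test.sh"
```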

Sure enough, after about 5 or 6 tests I got this message:

```
changeset:   42:c4bbabe79dde
user:        wcoenen
date:        Sun Nov 07 23:31:32 2010 +0100
summary:     Links are now handled by opening .md or by external application.
```

Aha! It looks like I added some code to handle link clicks by passing the URL to the OS (to open it with a suitable external application), and this code was now being triggered erroneously on Mono whenever the document was updated. Thank you, bisect!

## Wednesday, November 24, 2010

### git error: error setting certificate verify locations

I was trying to clone a repository from github when I ran into this error:
```
Cloning into docu...
error: error setting certificate verify locations:
  CAfile: \bin\curl-ca-bundle.crt
  CApath: none
while accessing https://github.com/jagregory/docu/info/refs
```

If you google around, many people "solve" this by disabling the SSL certificate check entirely. Obviously there is a reason for that check, so disabling it is not quite the right solution! It turns out that there is a mistake in the gitconfig file that comes with the msysgit setup (I have Git-1.7.2.3 installed). The right fix is to change the sslCAinfo setting in "c:\program files\git\etc\gitconfig" to this:

```
sslCAinfo = c:\\program files\\git\\bin\\curl-ca-bundle.crt
```
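If you prefer not to edit the file by hand, the same setting can be written with the git config command (adjust the path if you installed Git somewhere else):

```
git config --system http.sslCAInfo "c:\program files\git\bin\curl-ca-bundle.crt"
```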

## Tuesday, November 23, 2010

### Mercurial and subversion subrepos

It is not yet mentioned in the Mercurial book, but Mercurial has a subrepos feature to pull code from other repositories into your project as a nested repository. It is a bit similar to SVN externals and git submodules.

Better yet, it also works with Subversion! There are still some bugs to be worked out though: you'd better not move your SVN subrepo around inside your Mercurial repository. For all the ugly details, see my bug report.
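For reference, a Subversion subrepo is declared in the .hgsub file at the root of the Mercurial repository, using an [svn] prefix on the source URL. A sketch with a made-up path and URL:

```
lib/vendor = [svn]https://svn.example.com/project/trunk
```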