My first Docker image

You have probably heard about Docker by now. Me too. But it took me a long time to jump on that bandwagon (as a .NET developer). And you have probably heard about NoSQL too :)

Recently I discovered RethinkDB

The open-source database for the realtime web

which looks really cool, and I decided to explore it a little. Since I know myself – I tend to spend ages setting up a piece of software (i.e. the trial & error approach with heavy help from Google) and in the end it crashes completely – I decided to try Docker as well. But … since I am now using Windows 10, boot2docker using VirtualBox does not work there, the VMware provider for Vagrant is $78 and … stop … end of excuses: I chose a CentOS image running in VMware Player.

So I got the CentOS image up and running with the Docker daemon started, and added the phusion/baseimage-docker image (because it is special and fixes some issues of the stock Ubuntu 14.04 base image).

The base Dockerfile is:

# Use phusion/baseimage as base image. To make your builds reproducible, make
# sure you lock down to a specific version, not to `latest`!
# See https://github.com/phusion/baseimage-docker/blob/master/Changelog.md for
# a list of version numbers.
FROM phusion/baseimage:<VERSION>

# Use baseimage-docker's init system.
CMD ["/sbin/my_init"]

# ...put your own build instructions here...

# Clean up APT when done.
RUN apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*

Googling "docker rethinkdb", the second hit is this GitHub repository. So I took the jessie~2.0.4 Dockerfile (the latest version, of course):

FROM debian:jessie

MAINTAINER Stuart P. Bentley 

# Add the RethinkDB repository and public key
# "RethinkDB Packaging "
RUN apt-key adv --keyserver --recv-keys 1614552E5765227AEC39EFCFA7E00EF33A8F2399
RUN echo "deb jessie main" > /etc/apt/sources.list.d/rethinkdb.list


# The RethinkDB package version to install
ENV RETHINKDB_PACKAGE_VERSION 2.0.4~0jessie

RUN apt-get update \
	&& apt-get install -y rethinkdb=$RETHINKDB_PACKAGE_VERSION \
	&& rm -rf /var/lib/apt/lists/*

VOLUME ["/data"]


CMD ["rethinkdb", "--bind", "all"]

#   process cluster webui
EXPOSE 28015 29015 8080

and pasted the code into my base Dockerfile, tried to build the image and got this error:

The following packages have unmet dependencies:
rethinkdb : Depends: libprotobuf9 but it is not installable
            Depends: libstdc++6 (>= 4.9) but 4.8.4-2ubuntu1~14.04 is to be installed
E: Unable to correct problems, you have held broken packages.

So I googled around and found nothing. Then I decided to try to install the database from the official packages as described on the RethinkDB page:

source /etc/lsb-release && echo "deb http://download.rethinkdb.com/apt $DISTRIB_CODENAME main" | sudo tee /etc/apt/sources.list.d/rethinkdb.list
wget -qO- http://download.rethinkdb.com/apt/pubkey.gpg | sudo apt-key add -
sudo apt-get update
sudo apt-get install rethinkdb

And that worked!

That code, however, results in an error when used in a Dockerfile:

/bin/sh: 1: source: not found

which can easily be fixed by running this command in the Dockerfile:

rm /bin/sh && ln -s /bin/bash /bin/sh

That will install the RethinkDB startup service and start it! And the administration UI is accessible at http://localhost:8080!

During the installation I noticed a different version being installed this time – 2.0.4~0trusty – so I changed RETHINKDB_PACKAGE_VERSION to 2.0.4~0trusty and the database installation succeeded (but the service was not started):

invoke-rc.d: policy-rc.d denied execution of start.

Since I am not a Unix expert I just ignored this and added this line to use the default configuration (I don’t know how to use an external file in a Dockerfile):

RUN cp /etc/rethinkdb/default.conf.sample /etc/rethinkdb/instances.d/instance1.conf

In the end that did not work :( and I went back to the working version (the one with source). Which did not work either – no connection could be made to the RethinkDB Administration Console – and then it hit me! The container was not running:

docker run -d -p 8080:8080 -p 28015:28015 -p 29015:29015 dockerfile

And it worked! 😀 The RethinkDB Administration Console finally loaded.
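(For completeness – the image has to be built first, with something like the command below, assuming the Dockerfile sits in the current directory and the image is tagged dockerfile as in the run command above.)

docker build -t dockerfile .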

The final version of the Dockerfile is here:

FROM phusion/baseimage:0.9.17

# fixes 'source: not found'
RUN rm /bin/sh && ln -s /bin/bash /bin/sh

# Use baseimage-docker's init system.
CMD ["/sbin/my_init"]

# ...put your own build instructions here...
RUN apt-get update 

RUN apt-get install -y wget 

# Add the RethinkDB repository and public key
RUN apt-key adv --keyserver --recv-keys 1614552E5765227AEC39EFCFA7E00EF33A8F2399
RUN echo "deb trusty main" > /etc/apt/sources.list.d/rethinkdb.list


# the version noticed during the manual installation above
ENV RETHINKDB_PACKAGE_VERSION 2.0.4~0trusty

RUN apt-get update \
	&& sudo apt-get install -y rethinkdb=$RETHINKDB_PACKAGE_VERSION 

RUN cp /etc/rethinkdb/default.conf.sample /etc/rethinkdb/instances.d/instance1.conf

VOLUME ["/data"]


CMD ["rethinkdb", "--bind", "all"]

# process cluster webui
EXPOSE 28015 29015 8080

# Clean up APT when done.
RUN apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*

NInject problem with remote proxies

I have run into the following issue while working with NInject:

System.TypeInitializationException: The type initializer for 'ClassA' threw an exception. --->    
System.Runtime.Remoting.RemotingException: Attempted to call a method declared on type 'Ninject.IInitializable' on an object which exposes 'ClassB'.
Server stack trace:
  at System.Runtime.Remoting.Messaging.StackBuilderSink.VerifyIsOkToCallMethod(Object server, IMethodMessage msg)
  at System.Runtime.Remoting.Messaging.StackBuilderSink.SyncProcessMessage(IMessage msg)

Exception rethrown at [0]: 
  at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
  at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
  at Ninject.IInitializable.Initialize()
  at Ninject.Activation.Strategies.InitializableStrategy.b__0(IInitializable x) in InitializableStrategy.cs:line 28

The problem is that NInject tries to call Ninject.IInitializable.Initialize() on a remoting proxy (of ClassB) and that fails since that class does not implement the interface. This call is made by one of the activation strategies in the pipeline.

The workaround (inspired by Ydie’s blog post) is to create a new kernel class derived from StandardKernel, remove all instances of IActivationStrategy from the Components collection, add some of the default strategies back, and add my own strategy that does not try to invoke the Initialize method when the object being activated is an instance of the MarshalByRefObject class.

I don’t have any class implementing the IStartable interface, so I’m not re-adding the StartableStrategy, but for a complete fix that strategy should also be modified. Or the pipeline could be changed to ignore remoting proxies completely …

public class MyStandardKernel : StandardKernel
{
    protected override void AddComponents()
    {
        base.AddComponents();
        // remove all the activation strategies
        Components.RemoveAll<IActivationStrategy>();
        // add some of the default strategies back (code copied from NInject sources)
        Components.Add<IActivationStrategy, PropertyInjectionStrategy>();
        Components.Add<IActivationStrategy, MethodInjectionStrategy>();
        // I don't need this
        // Components.Add<IActivationStrategy, StartableStrategy>();
        // this is the new strategy
        Components.Add<IActivationStrategy, RemotingProxyAwareInitializableStrategy>();
    }
}

public class RemotingProxyAwareInitializableStrategy : ActivationStrategy
{
    /// <summary>Initializes the specified instance.</summary>
    /// <param name="context">The context.</param>
    /// <param name="reference">A reference to the instance being activated.</param>
    public override void Activate(IContext context, InstanceReference reference)
    {
        // do not call Initialize on remoting proxies (instances of MarshalByRefObject)
        if (reference.Instance is MarshalByRefObject)
            return;

        reference.IfInstanceIs<IInitializable>(x => x.Initialize());
    }
}
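Using it is then just a matter of swapping the kernel at the composition root – a minimal sketch (the IService interface and the binding are only illustrative):

// use the proxy-aware kernel instead of the plain StandardKernel
IKernel kernel = new MyStandardKernel();
kernel.Bind<IService>().To<ClassA>();  // illustrative binding
var service = kernel.Get<IService>();  // activation no longer calls Initialize on remoting proxies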

NInject’s modularity lets you replace different core components and that’s really great. I’m looking forward to the next problem :)

Roslyn powered live code analyzer

First there was a problematic WPF binding property, then I had to check all binding properties and then I thought about using FxCop to do that dirty job for me. But unfortunately FxCop is no longer developed and supported. That made me a little bit sad, since I really liked that tool and its power and usefulness.

But then I found the article by Alex Turner, "Use Roslyn to Write a Live Code Analyzer for Your API", and after reading it I no longer mourned for FxCop. The new Roslyn-powered while-you-type code analysis and error reporting is incredible!

And I decided to write my own DiagnosticAnalyzer rule. The rule I implemented is the good old CA2000: Dispose objects before losing scope. It is already there among other FxCop rules in the Microsoft.CodeAnalyzers.ManagedCodeAnalysis rule set. But I just wanted to try it and find out how difficult it is to create such a rule.

First I installed the required VSIX packages, then created a project from the provided template, copied the code from the MSDN page and started investigating how to check whether an object is disposed before all references to it go out of scope.

First I created a dummy class with both invalid and valid usages of disposable objects.

This is the code of the DummyClass class I used for testing (the test project is generated from the VS template and added to the solution).

using System.IO;
using System.Text;

class DummyClass
{
    private Stream _s;

    public DummyClass()
    {
        /*
         * almost correct usage:
         * value of field _s must be disposed later
         * (maybe the rule can suggest to implement IDisposable interface)
         */
        _s = Create();

        /*
         * correct usage:
         * assigning IDisposable inside using block to variables
         */
        using (Stream a = Create(), b = Create()) { }

        /*
         * correct usage:
         * assigning IDisposable inside using block to a previously declared variable
         */
        Stream c;
        using (c = Create()) { }

        /*
         * incorrect usage:
         * not using using statement for declaration and initialization of a IDisposable variable
         */
        var d = Create();

        /*
         * these lines were added just to prove that the rule is ignoring non-IDisposable variables
         */
        var sb = new StringBuilder(); // declaration and initialization of a non-IDisposable variable
        StringBuilder sb2;
        sb2 = new StringBuilder(); // assigning non-IDisposable to a previously declared variable
    }

    Stream Create()
    {
        return null; // the real value is not important, return type is
    }

    public void Method()
    {
        /*
         * incorrect usage:
         * not using using statement for declaration and initialization of a IDisposable variable
         */
        var stream = new MemoryStream();
    }
}

Note: I have found it very useful to keep the sample code in a separate compilable file and not in a string variable in a test method. The advantage is that you know the code is valid and it is easier to locate the reported error (unless you’re debugging your project in a sand-boxed Visual Studio instance).

The Roslyn Syntax Visualizer helped me to identify the nodes I have to check; those are

  • VariableDeclaration – for example var a = DisposableObject();, note that there can be more than one variable being declared
  • SimpleAssignmentExpression – for example a = DisposableObject();

An action has to be registered to trigger the analysis after the semantic analysis of those syntax nodes is completed:

public override void Initialize(AnalysisContext context)
{
    context.RegisterSyntaxNodeAction(AnalyzeNode, SyntaxKind.VariableDeclaration);
    context.RegisterSyntaxNodeAction(AnalyzeNode, SyntaxKind.SimpleAssignmentExpression);
}

Note: I used one callback action for both syntax nodes, but you can register one for each node and make the code cleaner.
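If you go that way, the registration could look like this (AnalyzeVariableDeclaration and AnalyzeAssignment are hypothetical method names, one per node kind):

public override void Initialize(AnalysisContext context)
{
    // one dedicated callback per syntax node kind
    context.RegisterSyntaxNodeAction(AnalyzeVariableDeclaration, SyntaxKind.VariableDeclaration);
    context.RegisterSyntaxNodeAction(AnalyzeAssignment, SyntaxKind.SimpleAssignmentExpression);
}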

And the final result is here – the basic idea is to check whether the type of the RHS node implements IDisposable and, if it does, check whether that happens inside a using block. With one exception: when the value is assigned to a class field.

using System.Linq;
using System.Collections.Immutable;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp;
using Microsoft.CodeAnalysis.CSharp.Syntax;
using Microsoft.CodeAnalysis.Diagnostics;

namespace Dev5.CodeFix.Analyzers
{
    [DiagnosticAnalyzer(LanguageNames.CSharp)]
    public class DisposeObjectsBeforeLosingScopeRule : DiagnosticAnalyzer
    {
        public const string DiagnosticId = "DisposeObjectsBeforeLosingScopeRule";
        internal const string Title = "Dispose objects before losing scope";
        internal const string MessageFormat = "A local object of a IDisposable type is created but the object is not disposed before all references to the object are out of scope.";
        internal const string Category = "Reliability";
        internal static DiagnosticDescriptor Rule = new DiagnosticDescriptor(DiagnosticId, Title, MessageFormat, Category, DiagnosticSeverity.Error, isEnabledByDefault: true);

        public override ImmutableArray<DiagnosticDescriptor> SupportedDiagnostics
        {
            get { return ImmutableArray.Create(Rule); }
        }

        public override void Initialize(AnalysisContext context)
        {
            context.RegisterSyntaxNodeAction(AnalyzeNode, SyntaxKind.VariableDeclaration);
            context.RegisterSyntaxNodeAction(AnalyzeNode, SyntaxKind.SimpleAssignmentExpression);
        }

        /// <summary>
        /// Gets type symbol for System.IDisposable interface.
        /// </summary>
        /// <param name="compilation"></param>
        /// <returns></returns>
        public static INamedTypeSymbol IDisposable(Compilation compilation)
        {
            return compilation.GetTypeByMetadataName("System.IDisposable");
        }

        /// <summary>
        /// Returns boolean value indicating if the <paramref name="typeInfo"/> implements System.IDisposable interface.
        /// </summary>
        /// <param name="typeInfo">TypeInfo to check</param>
        /// <param name="compilation"></param>
        /// <returns></returns>
        private static bool IsDisposable(TypeInfo typeInfo, Compilation compilation)
        {
            if (typeInfo.Type == null)
                return false;

            return !typeInfo.Type.IsValueType && typeInfo.Type.AllInterfaces.Any(i => i.Equals(IDisposable(compilation)));
        }

        private void AnalyzeNode(SyntaxNodeAnalysisContext context)
        {
            var semanticModel = context.SemanticModel;
            var compilation = context.SemanticModel.Compilation;
            // are we inside using block? i.e. is the Parent of current node UsingStatement
            var insideUsingStatement = context.Node.Parent is UsingStatementSyntax;

            var declaration = context.Node as VariableDeclarationSyntax;
            // variable declaration node
            if (declaration != null)
            {
                // more than one variable can be declared
                foreach (var declarator in declaration.Variables)
                {
                    var variable = declarator.Identifier;
                    var variableSymbol = semanticModel.GetDeclaredSymbol(declarator);
                    var eq = declarator.Initializer as EqualsValueClauseSyntax;
                    if (eq == null)
                        continue; // no initializer (e.g. 'Stream c;'), nothing to check here

                    var varTypeInfo = semanticModel.GetTypeInfo(eq.Value);
                    // non-disposable variable is declared or currently inside using block
                    if (!IsDisposable(varTypeInfo, compilation) || insideUsingStatement)
                        continue;

                    // report this
                    context.ReportDiagnostic(Diagnostic.Create(Rule, declarator.GetLocation()));
                }
            }

            var assignment = context.Node as AssignmentExpressionSyntax;
            if (assignment != null)
            {
                // does the type of the RHS node implement IDisposable?
                var typeInfo = semanticModel.GetTypeInfo(assignment.Right);
                if (!IsDisposable(typeInfo, compilation))
                    return;

                var identifier = assignment.Left as IdentifierNameSyntax;
                var kind = semanticModel.GetSymbolInfo(identifier).Symbol;
                // assigning field value or currently inside using block
                if (kind?.Kind == SymbolKind.Field || insideUsingStatement)
                    return;

                // report this
                context.ReportDiagnostic(Diagnostic.Create(Rule, assignment.GetLocation()));
            }
        }
    }
}

During the development of the analysis rule I found the Syntax Visualizer really, really helpful (but if you have it installed it is good to pin it for the sample code only – VS was quite sluggish when I switched tabs to the rule file). Google is also very helpful (as usual), since I was struggling to get the type and symbol info.

But overall I am super excited about this functionality, writing the rules is not that difficult, the live analysis is pretty impressive and the possibilities are infinite!

The code is available on GitHub.

How Powershell helped me to solve the assembly conflict

In one of my test projects the tests suddenly started to fail. Which is a bad thing. What was worse and strange was the reason why they were failing:

System.TypeInitializationException : The type initializer for 'RMReportingPortalDataLayer.Strategies.ReportDeliveryStrategy' threw an exception.
----> System.IO.FileNotFoundException : Could not load file or assembly 'log4net, Version=, Culture=neutral, PublicKeyToken=669e0ddf0bb1aa2a' or one of its dependencies. The system cannot find the file specified.}}

First I checked the bin directory and the assembly was not there. Then I checked the build log and found this line

No way to resolve conflict between "log4net, Version=, Culture=neutral, PublicKeyToken=669e0ddf0bb1aa2a" and "log4net, Version=, Culture=neutral, PublicKeyToken=692fbea5521e1304". Choosing "log4net, Version=, Culture=neutral, PublicKeyToken=669e0ddf0bb1aa2a" arbitrarily.

Ok, there is a conflict between incompatible versions of the log4net assembly. But which of the assemblies referenced in my project are the suspects? How to check all the referenced assemblies of the referenced assemblies? It’s a task nobody would like to perform manually. I like PowerShell, so I googled around, found this post and made a few changes – build a hash table and write the formatted output to the console:

$hash = $references | Group-Object Name, Version -AsString -AsHashTable

$hash.GetEnumerator() | Sort-Object Name | % { 
  $key = $_.Key.ToString().Trim()
  $value = $_.Value
  Write-Host $key
  $s = [string]::join([Environment]::NewLine + '   * ', ($value | Select-Object -ExpandProperty Who | Get-Unique | Sort-Object Who))
  Write-Host '   *', $s
}

Which produces nice output and all is clearer now :)

* Lib.Shared.Admin, Version=, Culture=neutral, PublicKeyToken=3941ae83427745cf
* Lib.Shared, Version=, Culture=neutral, PublicKeyToken=3941ae83427745cf
* Lib.Common, Version=, Culture=neutral, PublicKeyToken=null

I was referencing an older version of the Lib.Common library, so I updated it to the latest version and the problem was solved!
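For completeness, the $references collection used above is built by the part of the script that comes from the linked post – roughly like this sketch (adapt the bin path; each item carries Name, Version and Who, the name of the referencing assembly):

$references = Get-ChildItem "path to bin directory" -Filter *.dll | ForEach-Object {
  $assembly = [System.Reflection.Assembly]::LoadFile($_.FullName)
  $assembly.GetReferencedAssemblies() | ForEach-Object {
    New-Object PSObject -Property @{
      Name    = $_.Name
      Version = $_.Version
      Who     = $assembly.GetName().Name
    }
  }
}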

Note that PowerShell loads all the assemblies in the folder and keeps them loaded until the PowerShell window is closed. Or you can spawn a new PowerShell process:

$command = [System.IO.File]::ReadAllText("path to dependencies.ps1")
$bytes = [System.Text.Encoding]::Unicode.GetBytes($command)
$encodedCommand = [Convert]::ToBase64String($bytes)
powershell -NoProfile -EncodedCommand $encodedCommand # |%{$_}

The script is not perfect and you have to change the path to bin directory …

TypeScript build system for Sublime Text 2

Recently I decided to create a small Node.js project. And I like strongly typed languages. I know I can’t get full strong typing in JavaScript and all the power that comes with it. So I decided to use TypeScript for the project. I chose Sublime Text 2 as the editor; I am used to Visual Studio, but I wanted to get a grip on an editor that, as I read, “everyone loves”.

After downloading all the required bits I went to build (Ctrl+B) the project and ran into this error:

error TS5037: Cannot compile external modules unless the '--module' flag is provided.

This is easy to fix when you are building a TypeScript project from the command line:

tsc --module commonjs app.ts

This is the original typescript.sublime-build file I downloaded from the Internet:

  "cmd": ["c:\\Users\\user\\AppData\\Roaming\\npm\\tsc.cmd", "$file"],
  "file_regex": "(.*\\.ts?)\\s\\(([0-9]+)\\,([0-9]+)\\)\\:\\s(...*?)$",
  "selector": "app.ts"

and the fix is very simple (but it took me quite a while to figure it out) – just add new items ("--module" and "commonjs") to the cmd array:

  "cmd": ["c:\\Users\\user\\AppData\\Roaming\\npm\\tsc.cmd", "--module", "commonjs", "$file"],
  "file_regex": "(.*\\.ts?)\\s*\\(([0-9]+)\\,([0-9]+)\\)",
  "selector": "app.ts"

I had to modify the file_regex too, because it was not working.

How to Use SVG Glyphs in XAML

Using SVG glyphs on web pages for custom icons is brilliant (see Font Awesome, GlyphIcons, …). The SVG glyphs are encoded using a path expression that describes the vector graphics.

But wait! You can use the path expression to define a graphic element in XAML (which means WPF, Silverlight, WindowsPhone) too!

If you want to use the SVG glyphs in XAML, here is a simple guide:

  1. download the font
  2. find the icon you want to use (for example the Font Awesome suitcase icon) and its content value ("\f0f2")
  3. find the corresponding glyph in the svg file

    <glyph unicode="&#xf0f2;" horiz-adv-x="1792" d="M640 1152h512v128h-512v-128zM288 1152v-1280h-64q-92 0 -158 66t-66 158v832q0 92 66 158t158 66h64zM1408 1152v-1280h-1024v1280h128v160q0 40 28 68t68 28h576q40 0 68 -28t28 -68v-160h128zM1792 928v-832q0 -92 -66 -158t-158 -66h-64v1280h64q92 0 158 -66 t66 -158z" />

  4. add a new GeometryGroup with a child PathGeometry to the Resources in the XAML, and set the value of the PathGeometry.Figures property to the d value of the glyph

    <GeometryGroup x:Key="Suitcase">
    <PathGeometry Figures="M640 1152h512v128h-512v-128zM288 1152v-1280h-64q-92 0 -158 66t-66 158v832q0 92 66 158t158 66h64zM1408 1152v-1280h-1024v1280h128v160q0 40 28 68t68 28h576q40 0 68 -28t28 -68v-160h128zM1792 928v-832q0 -92 -66 -158t-158 -66h-64v1280h64q92 0 158 -66 t66 -158z" />

  5. Create a DrawingImage. The geometry drawing must be turned upside down (see the ScaleTransform below). I googled around, but didn’t find out why.

     <DrawingImage x:Key="SuitcaseImage"> 
    <GeometryDrawing Geometry="{StaticResource SuitcasePath}"> 
    <Pen Thickness="5" Brush="Black" /> 
    <ScaleTransform ScaleY="-1" /> 

  6. use the DrawingImage as a source for an Image

    <Image Source="{StaticResource SuitcaseImage}" Width="150" Height="150" />

This is a very simple approach to using the glyphs in XAML. It does not allow you to easily reuse the icon with different colors (but you can change the Pen brush color or set the Path.Fill brush), nor does it allow easy composition of icons (but you can add more GeometryDrawing elements to the DrawingGroup.Children collection).
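If you do want a different color, one simple option (a sketch, not part of the original steps) is to reuse the same geometry directly in a Path and set its Fill:

<Path Data="{StaticResource SuitcasePath}" Fill="SteelBlue" Stretch="Uniform" Width="150" Height="150" RenderTransformOrigin="0.5,0.5">
  <Path.RenderTransform>
    <ScaleTransform ScaleY="-1" />
  </Path.RenderTransform>
</Path>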

The complete code is available here; the result in the Kaxaml editor is the rendered suitcase icon.

Building solution with multiple NuGet package projects on TeamCity

Let’s say you have a solution with multiple projects in it and some of the projects represent NuGet packages. I know it is not ideal, but the NuGet package projects are really small and I didn’t want to create a solution for each individual project. They also share some code. And I didn’t want to publish all the packages when I made changes to one project only, not affecting the others.

A good example is a solution with UI controls – each project representing a group of UI elements or a single element. Like a set of controls from a big controls vendor – I imagine the solution is quite big, with projects like grid control, map control, charting, etc. By adding NuGet package projects you can keep these distributed separately – one package for the grid (ComponentVendor.UI.Grid), one package for maps (ComponentVendor.UI.Maps), etc. End-users do not have to download one big massive package (ComponentVendor.UI) which would make them delete the unused references later.

The solution structure

  Maps.NuGet (references Maps)
  Grid.Nuget (references Grid)

The first step was to create a TeamCity project for the solution and build configurations for:

  • root – build the entire solution (that’s why you set up the CI build) with VCS root pointing to source control root of the solution and with UI.sln as build file
  • all packages – manually triggered distribution of all the packages at once
  • each NuGet package project

Setting up the build configuration for NuGet package project:


  • create a new VCS root that points to the project source control directory, to “isolate” this project from changes made in other projects
  • add a VCS root checkout rule: +:.=>ProjectDirectory – this will checkout the sources into that directory
  • set the MSBuild build file to the .csproj file
  • add a build step that creates the NuGet package, with the .csproj file as the specification file and ProjectDirectory as the base directory

The build and publication of the package is triggered by changes made to the package project, for example by changing the package version.

Sharing NuGet repository across multiple solutions II

Second step: Fixing the HintPath of referenced assemblies

All distributed assemblies from a NuGet package are added as references to the project. NuGet sets the HintPath of each reference to a relative path (relative to the packages directory). When using a shared repository as described in the previous post, this brings some problems.

Problem with CI builds

The shared repository is stored somewhere on your machine and all project references added by NuGet point there. You usually don’t clone your development machine’s directory structure on your CI server, so the CI build will fail because the assemblies are not found (even though they are sitting right there in the shared directory on your machine).

Problem with package restore

Even the NuGet package restore feature won’t help, because it only downloads the missing packages and places them by default under the packages directory in the solution directory. But it does not change the HintPaths of the referenced assemblies. You can’t even set the MSBuild project property AdditionalLibPaths, because you don’t know beforehand all the directories where the package DLLs are placed (of course I could create a build task).

The solution to this is very easy: I found this discussion where a build property is used in the HintPath. I had to extend it a little to support both local and CI builds and to turn the package restore on.

 <PackagesDirectory>path to shared packages directory</PackagesDirectory>
<PropertyGroup Condition="Exists('$(PackagesDirectory)')">
<PropertyGroup Condition="!Exists('$(PackagesDirectory)')">

This setting covers both scenarios – local development and CI builds. The value of the NuGetPackagesDirectory property will be set to the path of the shared directory (verified by Condition="Exists('$(PackagesDirectory)')") on the local machine. On the CI server this directory might or might not exist. If it does not exist, the property will be set to the packages directory under the solution folder (used by package restore).

And the reference to a NuGet package DLL then uses the new property in its HintPath (the exact package path below is illustrative):

<Reference Include="NLog">
  <HintPath>$(NuGetPackagesDirectory)\NLog.2.0.1.2\lib\net40\NLog.dll</HintPath>
</Reference>

The branch and changeset details are available here.

Sharing NuGet repository across multiple solutions

I’ll start with the summary BenPhegan posted to a thread I started on the NuGet CodePlex site:

  1. Developers are historically used to having a local common library location for all their binary dependencies (ie all hintpaths reference “d:\common” or something similar).
  2. Often this library is controlled centrally and xcopy/robocopied to all developers machines to provide commonality of build capability, generally the central copy come from a CI build system.
  3. There are a lot of different projects that a developer works on with a lot of dependencies, and it is seen as efficient to have a single copy of these dependencies on disk.
  4. Project files are included from multiple solutions at arbitrary locations on disk, resulting in mismatched relative hintpaths when using NuGet.

The problem (not the only one) with a shared packages directory is that every project that ever installed a NuGet package creates a record in the repositories.config file. You end up with this file containing lots of these records. Also, some package directories might get deleted when a package is removed from one of the projects.
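For illustration, a shared repositories.config ends up looking something like this (the paths here are made up, but this is the shape of the file):

<?xml version="1.0" encoding="utf-8"?>
<repositories>
  <repository path="..\..\SolutionA\ProjectX\packages.config" />
  <repository path="..\..\SolutionB\ProjectY\packages.config" />
  <!-- one entry for every project that ever installed a package from this shared repository -->
</repositories>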

First step: Solution-level repositories.config file

My first approach to an “enterprise” NuGet was to get rid of the repositories.config file that is located under the packages directory. All the packages.config files can be discovered by iterating through all projects in the solution:

const string fileName = "packages.config";
foreach (var project in solutionManager.GetProjects())
{
  var projectItem = project.FindProjectItem(fileName);
  if (projectItem == null)
    continue;

  // we have the packages.config file here
}

But then I thought about it and got the idea that I can make the repositories.config file local to a solution, i.e. put it in the solution scope. And the best place to put a NuGet-specific file is the .nuget directory. The entries in the config file are added with paths relative to the solution folder.


Now there’s one repositories.config file for every solution, containing the records for every available packages.config. Package files in the shared directory can still be deleted when a package in the repository is no longer needed (this feature can be turned off, and then the shared repository must be cleared manually).

Vote for it on NuGet!

I tried to avoid big changes in the NuGet codebase and I was slowly getting to understand how the NuGet internals work. The next step was to force NuGet to use the shared packages repository. I changed some classes (to get the settings from the registry) only to find out that the only thing I had to do was to add a node to the C:\Users\(your name)\AppData\Roaming\NuGet\NuGet.config configuration file. Don’t forget to remove any repositoryPath from a NuGet.config file under your solution directory if you have one.

  <add key="repositoryPath" value="path to your repository"/>

The branch and changeset details can be viewed on CodePlex here.

The description of how to build (locally and CI) a solution with project(s) referencing assemblies in a shared local directory will follow.

Data-driven Knockout views II

Last time I posted a code snippet with a description of data-driven Knockout views. I have modified and simplified it and:

  • added support for forEach binding and
  • the type member of a ViewModel class can be a function that is evaluated for every item in the collection

This brings even more flexibility – the view model itself can decide which view to use based on the value returned from the type function.
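For illustration, a view model using this could look like the following sketch (the names ItemViewModel, DetailedItemViewModel and SimpleItemViewModel are made up; the binding handler below derives the view name from the returned string by replacing 'ViewModel' with 'View'):

// hypothetical view model - 'type' is a function, so the view is chosen per instance
function ItemViewModel(data) {
    var self = this;
    self.isDetailed = ko.observable(data.isDetailed);
    self.type = function () {
        // 'DetailedItemViewModel' will be rendered with the 'DetailedItemView' template
        return self.isDetailed() ? 'DetailedItemViewModel' : 'SimpleItemViewModel';
    };
}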

define(['knockout'], function(ko) {

    ko.bindingHandlers.content = {

        'init': function (element, valueAccessor, allBindingsAccessor, viewModel, bindingContext) {
            var options = ko.utils.unwrapObservable(valueAccessor());

            var templateNameFunc = function(value, itemBindingContext) {
                if('template' in options) {
                    return options['template'];
                }

                if(!value || !('type' in value)) {
                    throw new Error("Set 'type' of view model class.");
                }

                var type = value.type;
                if(typeof type == 'function') {
                    type = type();
                }

                return type.replace(/viewmodel/ig, 'View');
            };

            if ('foreach' in options) {
                var dataArray = (options['foreach']) || [];
                ko.renderTemplateForEach(templateNameFunc, dataArray, options, element, bindingContext);
                return;
            }

            var templateName = templateNameFunc(ko.utils.unwrapObservable(options['data'])),
                dataValue = ko.utils.unwrapObservable(options['data']);

            var innerBindingContext = bindingContext['createChildContext'](dataValue, options['as']);
            ko.renderTemplate(templateName || element, innerBindingContext, options, element);
        }
    };
});

The example will follow soon.