Roslyn-powered live code analyzer

First there was a problematic WPF binding property, then I had to check all the binding properties, and then I thought about using FxCop to do that dirty job for me. But unfortunately FxCop is no longer developed or supported. That made me a little sad, since I really liked that tool for its power and usefulness.

But then I found the article Use Roslyn to Write a Live Code Analyzer for Your API by Alex Turner, and after reading it I no longer mourned FxCop. The new Roslyn-powered while-you-type code analysis and error reporting is incredible!

And I decided to write my own DiagnosticAnalyzer rule. The rule I implemented is the good old CA2000: Dispose objects before losing scope. It is already there among the other FxCop rules in the Microsoft.CodeAnalyzers.ManagedCodeAnalysis rule set, but I just wanted to try it and find out how difficult it is to create such a rule.

First I installed the required VSIX packages, then created a project from the provided template, copied the code from the MSDN page, and started investigating how to check whether an object is disposed before all references to it go out of scope.

First I created a dummy class with both invalid and valid usages of disposable objects.

This is the code of the DummyClass class I used for testing (the test project is generated by the VS template and added to the solution).

using System.IO;
using System.Text;

class DummyClass
{
    private Stream _s;

    public DummyClass()
    {
        /* 
         * almost correct usage: 
         * value of field _s must be disposed later 
         * (maybe the rule can suggest to implement IDisposable interface) 
         */
        _s = Create();

        /* 
         * correct usage: 
         * assigning IDisposable inside using block to variables
         */
        using (Stream a = Create(), b = Create()) { }

        /* 
         * correct usage: 
         * assigning IDisposable inside using block to a previously declared variable 
         */
        Stream c;
        using (c = Create()) { }

        /* 
         * incorrect usage: 
         * not using using statement for declaration and initialization of an IDisposable variable 
         */
        var d = Create();

        /*
         * these lines were added just to prove that the rule is ignoring non-IDisposable variables
         */
        var sb = new StringBuilder(); // declaration and initialization of a non-IDisposable variable  
        StringBuilder sb2;
        sb2 = new StringBuilder(); // assigning non-IDisposable to a previously declared variable
    }

    Stream Create()
    {
        return null; // the real value is not important, return type is
    }

    public void Method()
    {
        /* 
         * incorrect usage: 
         * not using using statement for declaration and initialization of an IDisposable variable 
         */
        var stream = new MemoryStream();
    }
}

Note: I have found it very useful to keep the sample code in a separate compilable file rather than in a string variable in a test method. The advantage is that you know the code is valid, and it is easier to locate a reported error (unless you’re debugging your project in a sandboxed Visual Studio instance).

The Roslyn Syntax Visualizer helped me identify the nodes I have to check; they are:

  • VariableDeclaration – for example var a = DisposableObject();, note that there can be more than one variable being declared
  • SimpleAssignmentExpression – for example a = DisposableObject();

An action has to be registered to trigger the analysis after the semantic analysis of those syntax nodes is completed:

public override void Initialize(AnalysisContext context)
{
    context.RegisterSyntaxNodeAction(AnalyzeNode, SyntaxKind.VariableDeclaration);
    context.RegisterSyntaxNodeAction(AnalyzeNode, SyntaxKind.SimpleAssignmentExpression);
}

Note: I used one callback action for both syntax nodes, but you can register one for each node and make the code cleaner.

And the final result is here – the basic idea is to check whether the type of the RHS node implements IDisposable, and if it does, to check whether the assignment happens inside a using block. With one exception: when the value is assigned to a class field.

using System.Linq;
using System.Collections.Immutable;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp;
using Microsoft.CodeAnalysis.CSharp.Syntax;
using Microsoft.CodeAnalysis.Diagnostics;

namespace Dev5.CodeFix.Analyzers
{
    [DiagnosticAnalyzer(LanguageNames.CSharp)]
    public class DisposeObjectsBeforeLosingScopeRule : DiagnosticAnalyzer
    {
        public const string DiagnosticId = "DisposeObjectsBeforeLosingScopeRule";
        internal const string Title = "Dispose objects before losing scope";
        internal const string MessageFormat = "A local object of an IDisposable type is created but the object is not disposed before all references to the object are out of scope.";
        internal const string Category = "Reliability";
        internal static DiagnosticDescriptor Rule = new DiagnosticDescriptor(DiagnosticId, Title, MessageFormat, Category, DiagnosticSeverity.Error, isEnabledByDefault: true);

        public override ImmutableArray<DiagnosticDescriptor> SupportedDiagnostics
        {
            get { return ImmutableArray.Create(Rule); }
        }

        public override void Initialize(AnalysisContext context)
        {
            context.RegisterSyntaxNodeAction(AnalyzeNode, SyntaxKind.VariableDeclaration);
            context.RegisterSyntaxNodeAction(AnalyzeNode, SyntaxKind.SimpleAssignmentExpression);
        }

        /// <summary>
        /// Gets type symbol for System.IDisposable interface.
        /// </summary>
        /// <param name="compilation"></param>
        /// <returns></returns>
        public static INamedTypeSymbol IDisposable(Compilation compilation)
        {
            return compilation.GetTypeByMetadataName("System.IDisposable");
        }

        /// <summary>
        /// Returns boolean value indicating if the <paramref name="typeInfo"/> implements System.IDisposable interface.
        /// </summary>
        /// <param name="typeInfo">TypeInfo to check</param>
        /// <param name="compilation"></param>
        /// <returns></returns>
        private static bool IsDisposable(TypeInfo typeInfo, Compilation compilation)
        {
            if(typeInfo.Type == null)
            {
                return false;
            }
            return !typeInfo.Type.IsValueType && typeInfo.Type.AllInterfaces.Any(i => i.Equals(IDisposable(compilation)));
        }

        private void AnalyzeNode(SyntaxNodeAnalysisContext context)
        {
            var semanticModel = context.SemanticModel;
            var compilation = context.SemanticModel.Compilation;
            // are we inside using block? i.e. is the Parent of current node UsingStatement
            var insideUsingStatement = context.Node.Parent is UsingStatementSyntax;

            var declaration = context.Node as VariableDeclarationSyntax;
            // variable declaration node
            if (declaration != null)
            {
                // more than one variable can be declared
                foreach (var declarator in declaration.Variables)
                {
                    var eq = declarator.Initializer;
                    if (eq == null)
                    {
                        // declaration without an initializer, e.g. "Stream c;"
                        continue;
                    }
                    var varTypeInfo = semanticModel.GetTypeInfo(eq.Value);
                    // non-disposable variable is declared or currently inside using block
                    if (!IsDisposable(varTypeInfo, compilation) || insideUsingStatement)
                    {
                        continue;
                    }

                    // report this
                    context.ReportDiagnostic(Diagnostic.Create(Rule, declarator.GetLocation()));
                }
                return;
            }

            var assignment = context.Node as AssignmentExpressionSyntax;
            if (assignment != null)
            {
                // does the type of the RHS node implement IDisposable?
                var typeInfo = semanticModel.GetTypeInfo(assignment.Right);                
                if (!IsDisposable(typeInfo, compilation))
                {
                    return;
                }

                var identifier = assignment.Left as IdentifierNameSyntax;
                var symbol = identifier == null ? null : semanticModel.GetSymbolInfo(identifier).Symbol;
                // assigning a field value or currently inside a using block
                if (symbol?.Kind == SymbolKind.Field || insideUsingStatement)
                {
                    return;
                }

                // report this
                context.ReportDiagnostic(Diagnostic.Create(Rule, assignment.GetLocation()));
                return;
            }
        }
    }
}
DisposeObjectsBeforeLosingScopeRule.cs (full file on GitHub)

During the development of the analysis rule I found the Syntax Visualizer really, really helpful (though if you have it installed, it is good to pin it to the sample code only; VS was quite sluggish when I switched tabs to the rule file). Google was also very helpful (as usual), since I was struggling to get the type and symbol information.

But overall I am super excited about this functionality: writing the rules is not that difficult, the live analysis is pretty impressive, and the possibilities are infinite!

The code is available on GitHub.

How PowerShell helped me solve an assembly conflict

In one of my test projects the tests suddenly started to fail, which is a bad thing. What was worse, and strange, was the reason why they were failing:

System.TypeInitializationException : The type initializer for 'RMReportingPortalDataLayer.Strategies.ReportDeliveryStrategy' threw an exception.
----> System.IO.FileNotFoundException : Could not load file or assembly 'log4net, Version=1.2.11.0, Culture=neutral, PublicKeyToken=669e0ddf0bb1aa2a' or one of its dependencies. The system cannot find the file specified.

First I checked the bin directory and the assembly was not there. Then I checked the build log and found this line

No way to resolve conflict between "log4net, Version=1.2.13.0, Culture=neutral, PublicKeyToken=669e0ddf0bb1aa2a" and "log4net, Version=1.2.10.0, Culture=neutral, PublicKeyToken=692fbea5521e1304". Choosing "log4net, Version=1.2.13.0, Culture=neutral, PublicKeyToken=669e0ddf0bb1aa2a" arbitrarily.

Ok, there is a conflict between incompatible versions of the log4net assembly. But which of the assemblies referenced in my project are the suspects? How do I check all the referenced assemblies of the referenced assemblies? It’s a task nobody would like to perform manually. I like PowerShell, so I googled around, found this post, and made a few changes: build the hash table and write the formatted output to the console:

$hash = $references | Group-Object Name, Version -AsString -AsHashTable

$hash.GetEnumerator() | Sort-Object Name | % { 
  $key = $_.Key.ToString().Trim()
  $value = $_.Value
  Write-Host $key
  $s = [string]::join([Environment]::NewLine + '   * ', ($value | Select-Object -ExpandProperty Who | Sort-Object | Get-Unique))
  Write-Host '   *', $s
}

Which produces nice output and all is clearer now :)

log4net, 1.2.11.0
* Lib.Shared.Admin, Version=1.0.0.0, Culture=neutral, PublicKeyToken=3941ae83427745cf
* Lib.Shared, Version=10.0.0.0, Culture=neutral, PublicKeyToken=3941ae83427745cf
log4net, 1.2.13.0
* Lib.Common, Version=0.2.10.0, Culture=neutral, PublicKeyToken=null

I was referencing an older version of the Lib.Common library, so I updated it to the latest version and the problem was solved!

Note that PowerShell loads all the assemblies in the folder and keeps them loaded until the PowerShell session is closed. Alternatively, you can spawn a new PowerShell process:

$command = [System.IO.File]::ReadAllText("path to dependencies.ps1")
$bytes = [System.Text.Encoding]::Unicode.GetBytes($command)
$encodedCommand = [Convert]::ToBase64String($bytes)
powershell -NoProfile -EncodedCommand $encodedCommand # |%{$_}

The script is not perfect, and you have to change the path to the bin directory …

TypeScript build system for Sublime Text 2

Recently I decided to create a small Node.js project. I like strongly typed languages, and I know I can’t get full strong typing and its power in plain JavaScript, so I decided to use TypeScript for the project. I chose Sublime Text 2 as the editor; I am used to Visual Studio, but I wanted to get a grip on an editor that, as I read, “everyone loves”.

After downloading all the required bits I tried to build (Ctrl+B) the project and ran into this error:

error TS5037: Cannot compile external modules unless the '--module' flag is provided.

This is easy to fix when you build a TypeScript project from the command line:

tsc --module commonjs app.ts
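For context, the compiler complains because app.ts is an external module, i.e. it contains a top-level import or export. A minimal (hypothetical) app.ts that triggers the error could look like this:

```typescript
// app.ts - the top-level `export` makes this file an external module,
// which is why tsc refuses to compile it without the --module flag
export function greet(name: string): string {
    return "Hello, " + name;
}
```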

This is the original typescript.sublime-build file I downloaded from the Internet:

{
  "cmd": ["c:\\Users\\user\\AppData\\Roaming\\npm\\tsc.cmd", "$file"],
  "file_regex": "(.*\\.ts?)\\s\\(([0-9]+)\\,([0-9]+)\\)\\:\\s(...*?)$",
  "selector": "app.ts"
}

and the fix is very simple (though it took me quite a while to figure it out): just add two new items ("--module" and "commonjs") to the cmd array:

{
  "cmd": ["c:\\Users\\user\\AppData\\Roaming\\npm\\tsc.cmd", "--module", "commonjs", "$file"],
  "file_regex": "(.*\\.ts?)\\s*\\(([0-9]+)\\,([0-9]+)\\)",
  "selector": "app.ts"
}

I had to modify the file_regex too, because the original one was not working.

How to Use SVG Glyphs in XAML

Using SVG glyphs on web pages for custom icons is brilliant (see Font Awesome, GlyphIcons, …). The SVG glyphs are encoded using path expressions describing the vector graphics.

But wait! You can use the same path expressions to define graphic elements in XAML (which means WPF, Silverlight, Windows Phone) too!

If you want to use SVG glyphs in XAML, here is a simple how-to:

  1. download the font
  2. find the icon you want to use (for example the Font Awesome suitcase icon) and its content value ("\f0f2")
  3. find the corresponding glyph in svg file

    <glyph unicode="&#xf0f2;" horiz-adv-x="1792" d="M640 1152h512v128h-512v-128zM288 1152v-1280h-64q-92 0 -158 66t-66 158v832q0 92 66 158t158 66h64zM1408 1152v-1280h-1024v1280h128v160q0 40 28 68t68 28h576q40 0 68 -28t28 -68v-160h128zM1792 928v-832q0 -92 -66 -158t-158 -66h-64v1280h64q92 0 158 -66 t66 -158z" />

  4. add a new GeometryGroup with a child PathGeometry to the Resources in the XAML, and set the value of the PathGeometry.Figures property to the d value of the glyph

    <GeometryGroup x:Key="SuitcasePath">
      <PathGeometry Figures="M640 1152h512v128h-512v-128zM288 1152v-1280h-64q-92 0 -158 66t-66 158v832q0 92 66 158t158 66h64zM1408 1152v-1280h-1024v1280h128v160q0 40 28 68t68 28h576q40 0 68 -28t28 -68v-160h128zM1792 928v-832q0 -92 -66 -158t-158 -66h-64v1280h64q92 0 158 -66 t66 -158z" />
    </GeometryGroup>

  5. create a DrawingImage; the geometry drawing must be flipped vertically (see the ScaleTransform) because font glyph coordinates have the Y axis pointing up, while in XAML the Y axis points down

     <DrawingImage x:Key="SuitcaseImage">
       <DrawingImage.Drawing>
         <DrawingGroup>
           <DrawingGroup.Children>
             <GeometryDrawing Geometry="{StaticResource SuitcasePath}">
               <GeometryDrawing.Pen>
                 <Pen Thickness="5" Brush="Black" />
               </GeometryDrawing.Pen>
             </GeometryDrawing>
           </DrawingGroup.Children>
           <DrawingGroup.Transform>
             <TransformGroup>
               <ScaleTransform ScaleY="-1" />
             </TransformGroup>
           </DrawingGroup.Transform>
         </DrawingGroup>
       </DrawingImage.Drawing>
     </DrawingImage>

  6. use the DrawingImage as the source of an Image

    <Image Source="{StaticResource SuitcaseImage}" Width="150" Height="150" />

This is a very simple approach to using the glyphs in XAML. It does not allow you to easily reuse an icon with different colors (though you can change the Pen brush or set the Path.Fill brush), nor does it allow easy composition of icons (though you can add more GeometryDrawing elements to the DrawingGroup.Children collection).

The complete code is available here; the screenshot shows the result rendered in the Kaxaml editor.

Building solution with multiple NuGet package projects on TeamCity

Let’s say you have a solution where some projects represent NuGet packages. I know it is not ideal, but the NuGet package projects are really small and I didn’t want to create a solution for each individual project; they also share some code. And I didn’t want to publish all the packages when I made changes to only one project, not affecting the others.

A good example is a solution with UI controls – each project representing a group of UI elements or a single element. Think of a control set from a big controls vendor – I imagine the solution is quite big, with projects like a grid control, a map control, charting, etc. By adding NuGet package projects you can keep these distributed separately – one package for the grid (ComponentVendor.UI.Grid), one for maps (ComponentVendor.UI.Maps), etc. End users do not have to download one big massive package (ComponentVendor.UI) that would force them to delete the unused references later.

The solution structure

UI
  SharedCode
  Maps
  Maps.NuGet (references Maps)
  Grid
  Grid.Nuget (references Grid)
  ...

The first step was to create a TeamCity project for the solution and build configurations for:

  • root – build the entire solution (that’s why you set up the CI build), with a VCS root pointing to the source control root of the solution and with UI.sln as the build file
  • all packages – manually triggered publication of all the packages at once
  • each NuGet package project

Setting up the build configuration for a NuGet package project:

  • create a new VCS root that points to the project’s source control directory, to “isolate” this project from changes made in other projects
  • add a VCS root checkout rule +:.=>ProjectDirectory; this will check out the sources into that directory
  • set the MSBuild build file to the .csproj file
  • add a build step that creates the NuGet package, with the .csproj file as the specification file and ProjectDirectory as the base directory

The build and publication of a package is triggered by changes made to that package project, for example by changing the package version.

Sharing NuGet repository across multiple solutions II

Second step: Fixing the HintPath of referenced assemblies

All assemblies distributed in a NuGet package are added as references to the project. NuGet sets the HintPath of each reference to a relative path (relative to the packages directory). When using a shared repository, as described in the previous post, this brings some problems.

Problem with CI builds

The shared repository is stored somewhere on your machine and all project references added by NuGet point there. You usually don’t clone your development machine’s directory structure on your CI server, so the CI build will fail because the assemblies were not found (even though they are right there in the directory on your machine).

Problem with package restore

Even the NuGet package restore feature won’t help, because it only downloads the missing packages and places them, by default, under the packages directory in the solution directory; it does not change the HintPaths of the referenced assemblies. You can’t even set the MSBuild project property AdditionalLibPath, because you don’t know beforehand all the directories where the package DLLs are placed (of course, I could create a build task).

The solution to this is very easy. I found a discussion where a build property is used in the HintPath, and I had to extend it a little to support both local and CI builds and to turn package restore on.

<PropertyGroup>
 <PackagesDirectory>path to shared packages directory</PackagesDirectory>
</PropertyGroup>
<PropertyGroup Condition="Exists('$(PackagesDirectory)')">
  <NuGetPackagesDirectory>$(PackagesDirectory)</NuGetPackagesDirectory>
</PropertyGroup>
<PropertyGroup Condition="!Exists('$(PackagesDirectory)')">
  <NuGetPackagesDirectory>$(SolutionDir)\packages</NuGetPackagesDirectory>
</PropertyGroup>

This setting covers both scenarios – local development and the CI build. On the local machine the value of the NuGetPackagesDirectory property will be set to the path of the shared directory (verified by Condition="Exists('$(PackagesDirectory)')"). On the CI server this directory might or might not exist; if it does not exist, the property is set to the packages directory under the solution folder (used by package restore).

And the reference to a NuGet package DLL:

<Reference Include="NLog">
  <HintPath>$(NuGetPackagesDirectory)\NLog.2.0.1.2\lib\net40\NLog.dll</HintPath>
</Reference>

The branch and changeset details are available here.

Sharing NuGet repository across multiple solutions

I’ll start with the summary BenPhegan posted to a thread I started on the NuGet CodePlex site.

  1. Developers are historically used to having a local common library location for all their binary dependencies (ie all hintpaths reference “d:\common” or something similar).
  2. Often this library is controlled centrally and xcopy/robocopied to all developers’ machines to provide commonality of build capability; generally the central copy comes from a CI build system.
  3. There are a lot of different projects that a developer works on with a lot of dependencies, and it is seen as efficient to have a single copy of these dependencies on disk.
  4. Project files are included from multiple solutions at arbitrary locations on disk, resulting in mismatched relative hintpaths when using NuGet.

The problem (not the only one) with a shared packages directory is that every project that has ever loaded a NuGet package creates a record in the repositories.config file, so you end up with this file containing lots of such records. Also, some package directories might get deleted when a package is removed.
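To illustrate, a repositories.config in a shared packages directory accumulates an entry per packages.config that ever used the repository; its format looks roughly like this (the paths are made-up examples):

```xml
<?xml version="1.0" encoding="utf-8"?>
<repositories>
  <repository path="..\..\SolutionA\ProjectA\packages.config" />
  <repository path="..\..\SolutionB\ProjectX\packages.config" />
</repositories>
```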

First step: Solution-level repositories.config file

My first approach to an “enterprise” NuGet was to get rid of the repositories.config file located under the packages directory. All the packages.config files can be discovered by iterating through all the projects in the solution:

const string fileName = "packages.config";
foreach (var project in solutionManager.GetProjects())
{
  var projectItem = project.FindProjectItem(fileName);
  if(projectItem == null)
  { 
    continue;
  }

  // we have the packages.config file here
}

But then I thought about it and got the idea that I could make the repositories.config file local to a solution, i.e. put it in the solution scope. And the best place for a NuGet-specific file is the .nuget directory. The entries in the config file are added with paths relative to the solution folder.

Solution
  .nuget
     nuget.config
     nuget.exe
     nuget.targets
     repositories.config
  ProjectA
  ProjectB
  packages.config

Now there is one repositories.config file per solution, containing the records for every available packages.config. Package files in the shared directory can still be deleted when a package in the repository is no longer needed (this feature can be turned off, in which case the shared repository must be cleaned manually).

Vote for it on NuGet!

I tried to avoid big changes in the NuGet codebase, and I was slowly getting to understand how the NuGet internals work. The next step was to force NuGet to use the shared packages repository. I changed some classes (to get the settings from the registry) only to find out that the only thing I had to do was to add a node to the C:\Users\(your name)\AppData\Roaming\NuGet\NuGet.config configuration file. Don’t forget to remove any repositoryPath setting from a NuGet.config file under your solution directory if you have one.

<config>
  <add key="repositoryPath" value="path to your repository"/>
</config>

The branch and changeset details can be viewed on CodePlex here.

A description of how to build (locally and on CI) a solution with projects referencing assemblies in a shared local directory will follow.

Data-driven Knockout views II

Last time I posted a code snippet with a description of data-driven Knockout views. I have modified and simplified it and:

  • added support for forEach binding and
  • the type member of a ViewModel class can return a function that is evaluated for every item in a collection

This brings even more flexibility: the view model itself can decide which view to use, based on the value returned from the type function.

define([
    'knockout'
], function(ko) {

    ko.bindingHandlers.content = {

        'init': function (element, valueAccessor, allBindingsAccessor, viewModel, bindingContext) {
            var options = ko.utils.unwrapObservable(valueAccessor());

            var templateNameFunc = function(value, itemBindingContext)
            {
                if('template' in options)
                    return options['template'];

                if(!value['type'])
                {
                    throw new Error("Set 'type' of view model class.");
                }

                var type = value.type;
                if(typeof type == 'function')
                    type = type();

                return type.replace(/viewmodel/ig, 'View');
            };

            if ('foreach' in options) {
                var dataArray = (options['foreach']) || [];
                ko.renderTemplateForEach(templateNameFunc, dataArray, options, element, bindingContext);
                return;
            }

            var templateName = templateNameFunc(options.data),
                dataValue = ko.utils.unwrapObservable(options['data']);

            var innerBindingContext = bindingContext['createChildContext'](dataValue, options['as']);
            ko.renderTemplate(templateName || element, innerBindingContext, options, element);
        }
    };
});
contentBindingHandler.js (full file on Bitbucket)
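For example, a view model whose type member is a function might look like this (the names are hypothetical):

```javascript
// Item view model whose `type` is a function: the binding handler
// evaluates it per item and maps the result to a view name.
var chartItem = {
    kind: 'chart',
    type: function () {
        return this.kind === 'chart' ? 'ChartViewModel' : 'TableViewModel';
    }
};
```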

The example will follow soon.

Data-driven Knockout views

I’m using Caliburn.Micro in my Silverlight and WPF projects. I fell in love with the Caliburn automagic – the view locator, auto binding, etc. For my Node.js project I was looking for a JavaScript library supporting the MVVM pattern, and I found KnockoutJS.

Knockout has a very nice binding feature, with a simple way of implementing your own custom binding.

What I was missing was Caliburn’s binding/convention for ContentControl – data-driven view rendering:

<!-- DataContext is RootViewModel -->
<ContentControl Name="ChildViewModel" />

with the view located for the value of the RootViewModel.ChildViewModel property and then rendered.

The solution I came up with uses:

  • Infuser as a templating engine, more on using Infuser as templating engine for Knockout here
  • TrafficCop, used by Infuser to prevent duplicate AJAX requests

Infuser is configured to load the view templates from the views directory (I will implement a proper view locator later):

infuser.defaults.templateSuffix = ".tmpl.html";
infuser.defaults.templateUrl = "/js/app/views/"; 

and every view model object must have a property returning its type name (if someone knows a better way to do it, please let me know):

var SampleViewModel = kb.ViewModel.extend({
  constructor: function(model) {
    ...
    this.type = 'SampleViewModel';
  }, 
  ...
});

Code of the binding handler:

ko.bindingHandlers.content = {
  'init': function (element, valueAccessor, allBindingsAccessor, viewModel, bindingContext) {
    var options = ko.utils.unwrapObservable(valueAccessor());
    var templateName = options.data.type.replace(/viewmodel/ig, 'View'),
      dataValue = ko.utils.unwrapObservable(options['data']);
    var innerBindingContext = bindingContext['createChildContext'](dataValue, options['as']);
    ko.renderTemplate(templateName || element, innerBindingContext, options, element);
  }
};
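The view-locating convention in the handler is just a case-insensitive string replacement; in isolation it behaves like this (the helper name is made up):

```javascript
// Maps a view model type name to its view/template name,
// mirroring the replace() call in the binding handler above.
function viewNameFor(typeName) {
    return typeName.replace(/viewmodel/ig, 'View');
}
```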

and the HTML

<div data-bind="content: { data: childViewModel }"></div>
<script>
  var RootViewModel = kb.ViewModel.extend({
    constructor: function(model, options) {
      this.childViewModel = new SampleViewModel();
    }
  });
  ko.applyBindings(new RootViewModel());
</script>

NuGet pack error “Assembly outside lib folder”

When I was working on a NuGet package I noticed that some DLLs (that were not supposed to be there) were copied into the content subdirectory of the package.

Added file 'content\lib\Microsoft.Practices.EnterpriseLibrary.Common.dll'.

And that also caused an issue:

Issue: Assembly outside lib folder.
Description: The assembly 'content\lib\Microsoft.Practices.EnterpriseLibrary.Common.dll' is not inside the 'lib' folder and hence it won't be added as reference when the package is installed into a project.
Solution: Move it into the 'lib' folder if it should be referenced.

It took me some time to figure out that NuGet copies every project item whose Build Action is set to Content into the content directory of the package. The problem was solved by changing the Build Action to None.
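In the project file, the difference looks roughly like this (the assembly path is the one from the log above; the surrounding project file is omitted):

```xml
<!-- before: the item is packed into the package's content folder -->
<ItemGroup>
  <Content Include="lib\Microsoft.Practices.EnterpriseLibrary.Common.dll" />
</ItemGroup>

<!-- after: the item is left out of the package content -->
<ItemGroup>
  <None Include="lib\Microsoft.Practices.EnterpriseLibrary.Common.dll" />
</ItemGroup>
```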