Common Libraries – How to avoid Premature Optimisation

Common Libraries can be a great way to get a new project started and productive quickly, particularly for teams or companies with a broad portfolio of services on their books.

They can also be the source of a great deal of friction when created too early.

Premature Optimisation is the Root of all Evil – Donald Knuth

Before diving in and creating a new “Common” project, ask yourself a few questions:

  1. Do you understand the problem well enough to define an API appropriate for use across multiple projects?
    Unless you have actually needed a solution to the same problem across multiple projects, the answer to this is no.  This is your starting point: understand your requirements.
  2. Do you understand the solution to your problem well enough to create an implementation that’s genuinely fit for purpose, won’t change every time a new project comes along, and is appropriate to cover the use case(s)?
    Implement your solution at least twice alongside the problem.  The first implementation, by its nature, isn’t shared; the second time through, tackle it in isolation from the first, then apply the lessons learnt and refactor accordingly.  Don’t create common projects with only one consumer – YAGNI (You Aren’t Gonna Need It).
  3. Code should follow the DRY (Don’t Repeat Yourself) principle where possible, but do you understand whether the repetition is genuine, or merely coincidental?
    Microservice architectures aim to create isolated deployable units; be mindful of coupling services together through shared code artefacts at points where no coupling should exist.  Understand your service architecture and adhere to its interfaces.
  4. Do you have a clear way to deploy change to your common code?  Should your update be implicit or explicit?
    Internal Nuget feeds are a great way to share code; less great ways include SVN externals or passing DLLs around – I wouldn’t advocate either, but it does happen.  Understand what will work best for your environment.  Should your changes be rolled out across the board, or opted into through a new version of a package?
  5. Do you understand the impact of change on consumers?  Will “just” adding an extra enum value cause all of your other consumers to come grinding to a halt?  What will happen if you add an external dependency?
    A healthy set of unit tests should go some way to covering this.  Always be mindful of supporting the lowest common denominator and plan an integration strategy accordingly.  Nuget is great at managing version mismatches across dependencies and takes away a lot of the pain, but that doesn’t mean versioning shouldn’t be a consideration.
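To make question 5 concrete, here is a minimal C# sketch (the OrderStatus enum and its consumer are hypothetical, not from any real library): a consumer that switches exhaustively over a shared enum gains a reachable failure path the moment the library “just” adds a value.

```csharp
using System;

// Hypothetical enum defined in the shared library.
public enum OrderStatus
{
    Created,
    Shipped
    // Adding a Cancelled value here looks like a harmless library change...
}

// Hypothetical consumer living in another service.
public static class OrderStatusFormatter
{
    public static string Describe(OrderStatus status)
    {
        switch (status)
        {
            case OrderStatus.Created: return "Order created";
            case OrderStatus.Shipped: return "Order shipped";
            // ...but every consumer with a switch like this now throws at
            // runtime for the value it has never heard of.
            default: throw new ArgumentOutOfRangeException(nameof(status));
        }
    }
}
```

A unit test in each consumer that enumerates `Enum.GetValues(typeof(OrderStatus))` through `Describe` would catch this at build time rather than in production.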

Common Libraries, when done well, can enhance efficiency.  They can also easily get out of hand.  All code is Legacy Code once it’s been committed; it’s crucial that shared components are maintained and understood by the team, both in their implementation and in their integration into the services that use them.

Have you ever found yourself slowed down by attempting to share code?  What issues have you experienced?

Local Project with Nuget Package References – System.IO.FileNotFoundException

One of the features introduced with .Net Standard projects was the definition of Nuget package dependencies via PackageReference tags in the csproj file, removing the need for a packages.config.  There are, however, issues when a .Net Framework application using the legacy csproj and packages.config refers to a local project via a Project Reference, where that referenced project depends on a Nuget package via a PackageReference.
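For context, the two dependency styles look something like this (package and versions are illustrative, not taken from the projects below):

```xml
<!-- Legacy style: a packages.config file sitting alongside the csproj -->
<packages>
  <package id="Newtonsoft.Json" version="11.0.2" targetFramework="net472" />
</packages>

<!-- PackageReference style: declared directly inside the csproj -->
<ItemGroup>
  <PackageReference Include="Newtonsoft.Json" Version="11.0.2" />
</ItemGroup>
```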

System.IO.FileNotFoundException: 'Could not load file or assembly 'Newtonsoft.Json, Version=, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed' or one of its dependencies. The system cannot find the file specified.'

The dependent DLLs aren’t where the application expects them to be, because the package isn’t being referenced consistently throughout the application.


Nuget Package Reference

NugetPackageReference.Console – .Net Framework Application referencing local project NugetPackageReference.Dependent

NugetPackageReference.Dependent – .Net Standard Project referencing Nuget Package Newtonsoft.Json

DependentClass.cs – uses something from the Nuget package

namespace NugetPackageReference.Dependent
{
    public class DependentClass
    {
        public string NewtonsoftJsonNull()
        {
            // Touch a type from the Newtonsoft.Json package so the
            // assembly has to be loaded at runtime.
            return Newtonsoft.Json.JsonConvert.Null;
        }
    }
}


Program.cs – calls method referring to Nuget Package

using NugetPackageReference.Dependent;

namespace NugetPackageReference.Console
{
    class Program
    {
        static void Main(string[] args)
        {
            var dependentClass = new DependentClass();

            var nullString = dependentClass.NewtonsoftJsonNull();
        }
    }
}

This will give you…

File Not Found Exception


Option One – Reference Package

Add a Nuget reference to the missing package directly in the root project.
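In this example that means adding Newtonsoft.Json to the Console project itself, so the build copies Newtonsoft.Json.dll into its output folder.  A sketch of the resulting packages.config entry (the version number is illustrative):

```xml
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <!-- Duplicates the dependent project's package so the DLL lands in
       the Console project's bin folder; version is illustrative -->
  <package id="Newtonsoft.Json" version="11.0.2" targetFramework="net472" />
</packages>
```

The downside is that this duplicated reference must now be kept in step with the dependent project by hand.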

Option Two – Migrate Packages.config to PackageReference

There is a tool in Visual Studio to migrate package references into the csproj file, and where it works this method is preferable.  However, it doesn’t currently work for all project types (Service Fabric applications using .Net Framework, and WPF, for example).  If your project falls into one of those categories, the only way I know of to resolve the issue is Option One.

Right Click the packages.config file and select Migrate packages.config to PackageReference…

Migrate packages.config to PackageReference

Having done this, the root project will resolve dependencies further down the tree through the PackageReferences.
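After migration, the entries from packages.config become PackageReference items inside the legacy csproj, and packages.config is deleted.  A sketch of the resulting shape (the package name and version here are illustrative):

```xml
<!-- Former packages.config entries, now inside the Console csproj -->
<ItemGroup>
  <PackageReference Include="SomeOtherPackage" Version="1.0.0" />
</ItemGroup>
<!-- Note: Newtonsoft.Json needs no direct reference here any more –
     it flows transitively through the ProjectReference to
     NugetPackageReference.Dependent -->
```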


Microsoft have given us a great way of migrating towards .Net Core while retaining business logic from existing .Net Framework applications.  The integration of these components can be complex, however, and isn’t yet foolproof.

What issues have you faced migrating towards .Net Core?

Do you know of a better way to solve this for legacy projects?



I set myself a personal goal recently: publish a library on NuGet.  So that’s exactly what I’ve done.

SqlBulkCopyCat is a configurable wrapper around SqlBulkCopy in .NET for SQL Server.  It was born out of a personal frustration at work with being pushed towards SQL Server Integration Services (SSIS) for copying large amounts of data between databases, with no robust alternative that avoided taking on SSIS dependencies and the associated bloat in the development environment.
