Unfortunately, today we only have a patchwork of this integration, still quite incomplete and far from what it could be, and this is significantly hurting our development process. We really need something brand new here: we have lots of inputs and use cases, and while it is of course not possible to cover every aspect of all build workflows, it is certainly possible to address most of the common issues we are facing today. Let's try to figure out where this could lead!
What is a platform?
Hey, it looks like the Wikipedia definition is quite good:
> A computing platform is, in the most general sense, whatever pre-existing environment a piece of software is designed to run within, obeying its constraints, and making use of its facilities. Typical platforms include a hardware architecture, an operating system (OS), and runtime libraries.
So this could be:
- Targeting different CPUs: x86, x64, ARM...
- Targeting different OSes: Windows Desktop, Windows Phone, Windows Store Apps, Android, iOS, Xbox One OS, PS4...etc.
- Targeting other specific hardware through an existing API (the "runtime libraries" of the Wikipedia definition), like GPUs through OpenGL, Direct3D, Metal...etc.
How do we target a platform in .NET?
Here is the short story. Our daily life is of course a bit more complex.
For the CPU part:
- "Any CPU" is most of the time our time-saver (digression: why oh why must it be defined as "Any CPU" with a space in the solution, but expected to be "AnyCPU" without a space in a xxproj?!)
- But when we have to use some external native code (dlls), we have to "DllImport" these functions. The problem is that native code is compiled for a specific target CPU, so:
  - Either the library we are using ships with the OS. For example, DllImport of Direct3D from a .NET application is transparent, as the OS handles the x86/x64/ARM switch for us
  - Or we are using a custom external native dll:
    - Best case: we are lucky enough to be able to "LoadLibrary" (looking at you, Windows RT/Store) to preload the x86/x64 dll, and then let the DllImport use the already loaded dll
    - Lazy/lame case: patching the PATH environment variable (not always working)
    - Worst case: we are forced to compile our application separately for x86/x64/ARM because the target platform doesn't support multiple-CPU assemblies in the same package (doh! looking at you, Windows RT/Store) or DllImport is not working at all (doh! Silverlight CLR on Windows Phone 8.0). Even if 90% of our code could be AnyCPU and we just need one tiny dll function, we are good to compile and distribute 3 packages. That's our life... needless to say, painful.
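The "best case" preload trick above can be sketched like this (a minimal sketch; `MyNative.dll` and `MyFunction` are hypothetical names standing in for your own native dll):

```csharp
using System;
using System.IO;
using System.Runtime.InteropServices;

static class NativeHelper
{
    // On Windows, once a module with a given name is loaded,
    // subsequent DllImport calls reuse the already loaded module
    // instead of searching for it again.
    [DllImport("kernel32", SetLastError = true, CharSet = CharSet.Unicode)]
    private static extern IntPtr LoadLibrary(string fileName);

    public static void PreloadNativeDll()
    {
        // Pick the x86 or x64 subfolder depending on the current process.
        var cpu = IntPtr.Size == 8 ? "x64" : "x86";
        var path = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, cpu, "MyNative.dll");
        if (LoadLibrary(path) == IntPtr.Zero)
            throw new InvalidOperationException("Unable to load " + path);
    }

    // This DllImport then binds against the preloaded module.
    [DllImport("MyNative.dll")]
    public static extern int MyFunction();
}
```

Call `NativeHelper.PreloadNativeDll()` once at startup, before any P/Invoke into the dll is made.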
For the OS and runtime libraries part:
- If we are developing a library and are lucky enough not to use any OS specific APIs (looking at you, FileStream, no longer portable because of the Windows RT/Store mess!), we can go with Portable Class Libraries (PCL). Of course, if we failed to compile to Any CPU, we are good for the next choice.
- If we are developing an application (an exe, a dll activity...etc.) or a non PCL-friendly library, we are good to compile against specific tool-chains (the little msbuild files imported at the end of our xxproj, remember?) and assemblies
- Use external assemblies, libraries, tools
- Most of the time by having an "external" or "deps" folder in our product repo, storing dlls for a specific version, or being able to recompile these dependencies from the sources in an internal repo. Care must be taken about versioning
- Potentially integrating them in our build process (UsingTask, pre-source process, post-exe process, ILMerge...etc.)
- Potentially using NuGet to get all-in-one packages
- If we do so, be ready to accept that our xxproj will be messed up by NuGet in several places (see next part), and prepare to suffer in our VCS after a package update...etc.
- Best case: we can build everything from a single solution (sln), and in some cross-platform cases, using the kind of tricks I described in my previous post.
- Worst case: we need to handle different solutions for different platforms, sometimes requiring us to develop custom tools to synchronize projects between platforms
- Depending on some defines, we could have different builds for the same platform (like debug with logs/release no logs... etc.)
- Potentially developing custom msbuild targets and distributing them as part of our product
- Preparing to manage custom PowerShell and msbuild target files in a NuGet package if we have anything platform specific (like x86 native binaries), as in the SharpDX.targets used by the SharpDX NuGet package.
So, we somewhat end-up with:
- Best-case: We have a single PCL library. Go back home from work, kiss your family.
- Social-case: We are publishing our PCL to NuGet
- Worst-case: We have (multiple CPUs to support) x (multiple OS/Store rules) x (multiple platform specific APIs) assemblies to
  - develop (hey, #ifdef, we still love you, you know)
  - build (hey, Condition="'$(XXX)'=='true'" is our friend, and oh, don't expect to avoid msbuild's underground; msbuild is like our grandma, she still needs lots of love)
  - deploy (hey... hm, ok, I give up, too many options for a one-liner "hey")
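A minimal sketch of the kind of conditional plumbing this implies in a xxproj (the property name `MyPlatform` and the referenced assembly are hypothetical):

```xml
<!-- Hypothetical per-platform plumbing inside a csproj -->
<PropertyGroup Condition="'$(MyPlatform)' == 'WindowsStore'">
  <!-- Drives the #ifdef side in the source code -->
  <DefineConstants>$(DefineConstants);WINDOWS_STORE</DefineConstants>
</PropertyGroup>
<ItemGroup Condition="'$(MyPlatform)' == 'WindowsStore'">
  <!-- Platform specific assembly, stored in a deps folder -->
  <Reference Include="MyLib.WindowsStore">
    <HintPath>..\deps\winrt\MyLib.WindowsStore.dll</HintPath>
  </Reference>
</ItemGroup>
```

Multiply this pattern by each CPU/OS/API combination and the "grandma" comparison starts to feel generous.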
Build packages vNext
As a preamble, a little note about NuGet. NuGet has been helping a lot in this area and is a great contribution to the develop/build/deploy chain, but it still has to struggle with legacy builds, sometimes through no fault of its own, in particular:
- We are still referencing lots of assemblies through the regular "Add Reference..." dialog because they don't have NuGet packages
- NuGet is much more intrusive in the xxproj files than a simple "Add Reference": it has to store relative paths (bad), and if the NuGet package has target files, it needs to add some significant code to our xxproj (for example, in SharpDX)
- NuGet still needs to add references to our packaged assemblies, so if our package "Dummy" has 50 Dummy.ABC.*.dll assemblies, we will see a lot of them in our "References"
- NuGet doesn't have a probing path for looking up installed local assemblies; it needs to store the assembly reference paths directly in the xxproj, forcing a package storage location (that can be configured in a nuget.config, but still, no probing path). For example, if we move the project within the directory structure, it doesn't compile any more.
- NuGet is not VCS friendly. Updating the version of a package can cause *lots* of updates in our xxproj: pray that nobody else is doing the same thing on the same project on a different branch.
- PCLs are good because they surface the API, exposing a lightweight cross-platform core.
- We still need to live with platform specific assemblies
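For reference, the package storage location mentioned above can be configured in a nuget.config like this (the path is just an example):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <config>
    <!-- Moves the packages folder; note that this does not give
         NuGet a probing path for assembly references, which are
         still written as relative paths into the xxproj -->
    <add key="repositoryPath" value="..\external\packages" />
  </config>
</configuration>
```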
So, we can somewhat improve the process here by unifying the old and new in a Package vNext concept.
A Package vNext would be pretty much like the NuGet packages we have today and would contain:
- A version number
- All meta descriptions found in NuGet (Owners, Project urls...etc)
- Dependencies to other packages/versions
- A set of assemblies, compiled for different platforms (or a single platform if it is really platform specific).
- Potentially a set of public properties/flags exposed by the package that could be set by the referencing project, allowing us to configure how we link against this package (some specific assemblies or not, depending on the platform...etc.)
- Potentially PDBs with direct source code included in the package (but unlike NuGet, not stored on a PDB Symbol server)
- Potentially documentation that would be automatically accessible from the IDE
- Potentially user files to add to the current project
- Potentially providing different additional build files (msbuild target files), transparently added to the build (but unlike today, not modifying the host msbuild files)
- Potentially an install plugin helper (like PowerShell, but I would prefer a .NET interface/plugin system instead of the unfriendly PowerShell syntax)
- Potentially providing IDE extensions (recognized by some IDEs, that could provide a specific IDE extension for VS or Xamarin Studio...etc.)
- Working also for C++ package: providing includes, libs...etc. (and here, C++ would gain a *lot*)
- Packages could be signed (non-modifiable)
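To make this more concrete, a Package vNext manifest could look something like this (a purely hypothetical sketch; every element name here is invented, nothing of this exists today):

```xml
<!-- Hypothetical Package vNext manifest -->
<package id="YourPackage" version="1.0.0" signed="true">
  <metadata>
    <owners>...</owners>
    <projectUrl>...</projectUrl>
  </metadata>
  <dependencies>
    <dependency id="OtherPackage" version="2.1" />
  </dependencies>
  <!-- One assembly set per platform, plus a portable core -->
  <assemblies>
    <set platform="portable" path="lib/portable" />
    <set platform="WindowsStore" cpu="x86;x64;ARM" path="lib/winrt" />
  </assemblies>
  <!-- Public flags that the referencing project may set -->
  <properties>
    <property name="UseDebugAssemblies" default="false" />
  </properties>
  <!-- Targets added transparently to the build, without
       modifying the host msbuild files -->
  <build targets="build/YourPackage.targets" />
</package>
```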
All our xxproj projects (C#, VB, F#...etc.) would reference a package vNext (but usually not a path to the package, though it could be possible in some cases), just like this:
```xml
<ItemGroup>
  <!-- Package loaded from probing paths -->
  <PackageReference Include=".NET" Version="4.0" />
  <PackageReference Include="YourPackage" Version="1.0" />
  <!-- Package loaded from probing paths but with the version defined at solution level -->
  <PackageReference Include="YourPackageSpecialVersion" />
  <!-- Package loaded from specific path -->
  <PackageReference File="path/to/location/FixedLocalPackage-1.0.0" />
</ItemGroup>
...
<Import Project="$(MSBuildToolsPath)\Microsoft.CommonvNext.targets" />
```
This is the only modification that would be required to reference a package. Everything else (target, custom tasks, files...etc.) would be automatically handled and integrated by the build system (here the CommonvNext.targets).
When we are targeting a platform specific application, or providing a PCL library, this should only be specified by some properties at the beginning of the project. We would not have to reference explicit targets/dlls in the project (currently, we need to include CSharp.targets, or Xaml.targets, or WindowsPhone.targets...etc.); this would be handled by the build system.
The package version could be defined directly at:
- the xxproj project level
- the solution level, in order to avoid the multiplication of versions all around in all the projects of a solution (like a sealed version that could not be overridden unless specified explicitly with an "override" attribute, exactly like in our languages)
- the system level, provided by the system:
  - overridden locally at the solution level
  - overridden locally at the project level
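A solution-level version pin could look like this (hypothetical syntax, in the spirit of the PackageReference element shown earlier; the element names are invented):

```xml
<!-- Hypothetical solution-level file pinning package versions -->
<PackageVersions>
  <!-- Sealed: a project cannot change this version without
       an explicit "override" attribute on its own reference -->
  <PackageVersion Include="YourPackageSpecialVersion" Version="2.3" Sealed="true" />
</PackageVersions>
```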
When compiling a project that targets multiple platforms, the IDE should provide a way to easily identify which files go to which platform in a xxproj. This is a bit orthogonal to the Package vNext concept, but quite important to it if we want a project to easily target multiple platforms.
Packaging and publishing a Package vNext should be part of the build system, just as NuGet does it with nuspec files or directly from xxproj files. It means that building a solution or a project would produce one or several Packages vNext directly consumable by other projects. A Package in a solution could itself contain one or several projects...etc. But a project would reference other packages, not projects.
Digression on the implementation of such a system with the current msbuild system: one limitation of msbuild is that it cannot import a variable list of *.targets files; the whole list must be known when the build file is loaded. But a workaround would be for the build system to generate an intermediate build file (only used internally), exactly like it does for solution files (which are converted into a single msbuild file when building a solution).
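That workaround amounts to generating something like this intermediate file (a hypothetical sketch of what the build system would emit internally; `$(PackageCache)` is an invented property):

```xml
<!-- Hypothetical auto-generated file: a static list of imports,
     produced after resolving the PackageReference items, so that
     msbuild's "imports must be known up front" limitation is satisfied -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <Import Project="$(PackageCache)\YourPackage\1.0\build\YourPackage.targets" />
  <Import Project="$(PackageCache)\OtherPackage\2.1\build\OtherPackage.targets" />
</Project>
```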
With such a system, we would be able:
- To develop a cross platform application from a single solution, and even from a same project able to target multiple platforms
- To enhance the experience of working with libraries (core .NET framework, external libs...etc.) with a unified system instead of having several systems/workarounds (add reference, target files, NuGet packages)
- To reduce the changes/friction in xxproj when we are switching package versions...etc., leading to a much more VCS friendly build system
A build dream to build!
Ok, let's face it: this post is describing a "nice to have" concept. It is always easy to write an article scratching the surface like this one, but way more difficult to implement it! When looking at the NuGet source code, we can see that it takes *lots* of work to provide this kind of infrastructure.
Still, I believe that a full integration of the notion of package is a key direction for developing, building and deploying cross platform/platform-specific applications in .NET and we should embrace it at the core of our build system.
So what do you think about all this? I'm sure there are lots of ideas that could improve this concept, so please, share them!