I was trying to get SpecFlow to build under Mac OS and had some problems with an MSBuild target (https://github.com/techtalk/SpecFlow/blob/master/Tests/TechTalk.SpecFlow.Specs/.build/build.targets#L32).
We need to replace some text in a file with a value that is only available during the build. For that we used the RoslynCodeTaskFactory, which has the benefit that you don't have to write a whole MSBuild task. But neither msbuild nor dotnet build was able to find the Microsoft.Build.Tasks.Core.dll.
So I decided to write a new MSBuild task anyway.
But this post is not about writing the n-th MSBuild task; it's about the fact that I was able to write it, create a NuGet package for it and set up a continuous build with automatic deployment to NuGet.org. And I was able to do all of it in less than 8 hours (and on a Mac).
You can find the whole sources here.
Writing the MSBuild task
Writing an MSBuild task is not hard. You need to reference Microsoft.Build.Framework and Microsoft.Build.Utilities.Core in your project. Your class has to inherit from the Task abstract class (https://github.com/SabotageAndi/MSBuild.AdditionalTasks/blob/master/src/MSBuild.AdditionalTasks/Tasks/ReplaceTextInFile/ReplaceTextInFileTask.cs#L8). In the Execute method you implement your task's logic.
And done!
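A minimal task of this shape could look roughly like the following sketch (the property names here are illustrative, not the actual implementation; see the linked source for the real one):

```csharp
using Microsoft.Build.Framework;
using Microsoft.Build.Utilities;

namespace MSBuild.AdditionalTasks
{
    // Sketch of a text-replace task; property names are illustrative.
    public class ReplaceTextInFileTask : Task
    {
        [Required]
        public string InputFile { get; set; }

        [Required]
        public string OutputFile { get; set; }

        [Required]
        public string TextToReplace { get; set; }

        [Required]
        public string TextToReplaceWith { get; set; }

        public override bool Execute()
        {
            // read, replace, write back
            var content = System.IO.File.ReadAllText(InputFile);
            var newContent = content.Replace(TextToReplace, TextToReplaceWith);
            System.IO.File.WriteAllText(OutputFile, newContent);
            return true; // returning false would fail the build
        }
    }
}
```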
Creating a NuGet package for the MSBuild task to redistribute it
Thanks to the SDK-style project system, a nuspec file is no longer needed most of the time when you create a NuGet package. Everything can be specified in the project file.
Packaging an MSBuild task is a little different from packaging normal libraries. You have to put the assembly into the tasks folder and not into the standard lib folder. And then you need additional MSBuild files to register your task with MSBuild (https://github.com/SabotageAndi/MSBuild.AdditionalTasks/tree/master/src/MSBuild.AdditionalTasks/build). Thanks to Nate McMaster, who wrote a blog post about packaging MSBuild tasks (https://natemcmaster.com/blog/2017/07/05/msbuild-task-in-nuget/).
What has changed since he published his article is that NuGet.org now requires a license to be included in the package. For that you need to specify which file contains the license (https://github.com/SabotageAndi/MSBuild.AdditionalTasks/blob/master/src/MSBuild.AdditionalTasks/MSBuild.AdditionalTasks.csproj#L22) and pack the license file into your package (https://github.com/SabotageAndi/MSBuild.AdditionalTasks/blob/master/src/MSBuild.AdditionalTasks/MSBuild.AdditionalTasks.csproj#L35).
To make it easier to put the generated NuGet package into the build artifacts in the build pipeline, I configured the output folder for packages to be outside of the project (https://github.com/SabotageAndi/MSBuild.AdditionalTasks/blob/master/src/MSBuild.AdditionalTasks/MSBuild.AdditionalTasks.csproj#L31). This also makes it easier to use the generated package in a test/sample project.
Additionally I enabled GeneratePackageOnBuild so that the NuGet package is generated every time the project is built (https://github.com/SabotageAndi/MSBuild.AdditionalTasks/blob/master/src/MSBuild.AdditionalTasks/MSBuild.AdditionalTasks.csproj#L30).
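Put together, the packaging-relevant parts of a csproj for an MSBuild task could look roughly like this sketch (the paths and values are illustrative; the exact ones are in the linked project file):

```xml
<PropertyGroup>
  <!-- reference the packed license file as the package license -->
  <PackageLicenseFile>LICENSE</PackageLicenseFile>
  <!-- put generated packages outside the project tree -->
  <PackageOutputPath>..\..\nupkgs</PackageOutputPath>
  <!-- create the package on every build -->
  <GeneratePackageOnBuild>true</GeneratePackageOnBuild>
  <!-- an MSBuild task package should not ship its assembly in lib/ -->
  <IncludeBuildOutput>false</IncludeBuildOutput>
</PropertyGroup>

<ItemGroup>
  <!-- pack the license file and the .props/.targets registration files -->
  <None Include="LICENSE" Pack="true" PackagePath="" />
  <None Include="build\**" Pack="true" PackagePath="build\" />
</ItemGroup>
```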
To check if everything was correct in the NuGet package, I uploaded it manually to NuGet.org. That's where I got the notice that I had to add the license to the package. ;-)
Continuous Build
I love Azure Pipelines. And since they provide 10 free unlimited hosted build agents for Open Source projects (https://azure.microsoft.com/en-us/blog/announcing-azure-pipelines-with-unlimited-ci-cd-minutes-for-open-source/), it's even better than before (where you were limited to 300 minutes per month). And they have hosted build agents for Windows and Mac OS.
You can use the web UI to define your build pipeline, or you can do it via yaml (https://docs.microsoft.com/en-us/azure/devops/pipelines/get-started-yaml?view=azure-devops).
Yaml has the benefit that you can define the pipeline once and run it as jobs on different agent pools.
Azure Pipeline definition: https://github.com/SabotageAndi/MSBuild.AdditionalTasks/blob/master/azure-pipelines.yml
Job definition: https://github.com/SabotageAndi/MSBuild.AdditionalTasks/blob/master/build.yml
Build status: https://sabotageandi.visualstudio.com/MSBuild.AdditionalTasks/_build?definitionId=6
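The "same pipeline, different agent pools" trick boils down to a job template. A simplified sketch of such an azure-pipelines.yml (the pool images and parameter names here are illustrative; the real definitions are in the files linked above):

```yaml
# azure-pipelines.yml (simplified sketch)
jobs:
- template: build.yml        # the shared job definition
  parameters:
    name: Windows
    vmImage: vs2017-win2016  # illustrative hosted image name
- template: build.yml
  parameters:
    name: macOS
    vmImage: macOS-10.13     # illustrative hosted image name
```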
The part where I struggled the most was finding the right path for the PublishBuildArtifact task (https://github.com/SabotageAndi/MSBuild.AdditionalTasks/blob/master/build.yml#L58). In the end the problem was that I had specified a file pattern; it looks like that doesn't work with this task.
Deploy it to NuGet.org
With a working build pipeline in place, I created a new release pipeline that takes the artifact and uploads the nupkg files to NuGet.org.
As of writing this article, only the UI is available to define release pipelines.
Release pipeline
Release pipeline detail
The “hardest” part was to create the service connection with NuGet.org. For that, you need an API key for your account. The steps for that are documented here.
When you have it, you can create a NuGet service connection in Azure Pipelines.
This service connection has to be used in the NuGet task, after specifying the correct path to the NuGet packages (yes, paths are hard again).
And that's it. On the second run of the release pipeline, a new version of the NuGet package was uploaded to NuGet.org. Yeah!
Problems on the way
I wanted to have one way to build the task on Windows and Mac OS. I chose PowerShell, because it is available on both Mac OS and Windows.
To build, I used dotnet build, so I don't have to locate msbuild on my system (Mac OS: in PATH, Windows: search via vswhere.exe). But because I build the sample/test project in the same script, the NuGet package gets cached in the local package cache. So I need to delete the cached files before the build.
Obviously, depending on the OS, there are different ways to get this folder. In this case, a simple check whether an environment variable exists ($HOME on Mac OS) is enough: https://github.com/SabotageAndi/MSBuild.AdditionalTasks/blob/master/build_and_tests.ps1#L7.
Once we have the folder, removing it is easy, if you know PowerShell (I didn't; now I do, a little bit more).
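The relevant part of such a script boils down to something like the following sketch (the package id here is an assumption for illustration; the real script is linked above):

```powershell
# Locate the local NuGet package cache, depending on the OS.
if ($env:HOME) {
    # Mac OS (and Linux): $HOME is set
    $nugetCache = Join-Path $env:HOME ".nuget/packages"
}
else {
    # Windows
    $nugetCache = Join-Path $env:USERPROFILE ".nuget\packages"
}

# Remove the cached copy of our package (illustrative package id),
# so the freshly built one is restored on the next build.
$cachedPackage = Join-Path $nugetCache "msbuild.additionaltasks"
if (Test-Path $cachedPackage) {
    Remove-Item $cachedPackage -Recurse -Force
}
```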
Conclusion
It's fascinating how easy this stuff has become over the last few years. A few years ago, all of this was multiple days of work. Now I was able to do it in a few hours. And I spent most of that time on the PowerShell build script.
So I think I should learn PowerShell. :-)
Yesterday I started a new Xamarin.Forms project in F# on my Mac. Because the NuGet experience isn’t the best in Visual Studio for Mac, I always use Paket for managing my dependencies.
Normally I just follow the steps here and here. But this time, after I had done everything, the project didn't compile anymore.
After some time fiddling with Paket (I am not an expert user of it), I found the solution. You have to add a framework restriction to the Xamarin.Forms and Xamarin.Android.FSharp.ResourceProvider entries.
The reason is that if you don't add it, Paket uses MonoAndroid1.0 as the target framework on Android, and so the dependencies are not resolved correctly. I found this GitHub issue for this behaviour: https://github.com/fsprojects/Paket/issues/2762
The complete paket.dependencies file looks like this:
source https://www.nuget.org/api/v2
nuget FSharp.Core
nuget Xamarin.Forms framework: MonoAndroid8.1,netstandard2.0,xamarinios
nuget Xamarin.Android.FSharp.ResourceProvider framework:monoandroid8.1
In preparation for continuing the work on .NET Core/Standard support for SpecFlow, and after recognising in talks at the MVP Summit that it is not that simple to explain how SpecFlow works, I thought it would be good to start writing it down.
So as a starting point, let's have a look at the normal workflow when you write a new scenario:
- The user opens or creates a feature file, writes the Scenario in Gherkin and saves.
- A code-behind file is generated, which contains code for a test class of the configured unit test provider. This could be SpecFlow+Runner, xUnit, NUnit, MsTest or MbUnit.
- The user selects the line and presses F12.
- A dialog opens, which contains a skeleton for a binding method for this sentence.
- The user adds the skeleton binding code to a new or existing binding class.
- The user implements the binding, probably with some helper methods from the TechTalk.SpecFlow.Assists namespace.
- The user compiles their project.
- The user runs the tests.
- The tests are hopefully green.
That's a lot of steps to get from a scenario to an executed test.
And these are the parts that are involved in this:
- Generator
- Code-Behind File generation
- Plugins
- Runtime
- Binding lookup
- Assists
- Plugins
- Visual Studio Extension
- Syntax Highlighting/IntelliSense
- Navigation
- Code-Behind Generation
- Skeleton Generation
- ItemTemplates
More than you would think, right?
Additionally, SpecFlow also has a console application (specflow.exe), which contains crucial functionality as well:
- SpecFlow.exe
- generateAll Command
- MSBuild tasks
- Reports
To make it easier to read for you (and to write for me), every major part will get its own blog post. So you don't have to wait until I am completely finished with everything. I will add links to the individual posts when they are ready.
Some time after the .NET team announced that they would add global tools to .NET Core, I asked myself if it would be possible to combine them with Avalonia to build a global tool with a UI.
For those who don't know Avalonia: you should check it out. It's a UI framework that uses XAML like WPF and Xamarin.Forms, and it is implemented for .NET Framework and .NET Core. It runs on Windows, Linux and Mac OS.
Back to the global tools.
TL;DR: You can use Avalonia together with global tools. Here is a Sample: https://github.com/SabotageAndi/LicenseActivatorPrototype
But what is needed? Not much.
You have to add the PackAsTool and the ToolCommandName properties. PackAsTool marks the project as a tool that can be installed; ToolCommandName specifies how the tool is invoked after installation.
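In the csproj this could look like the following sketch (the tool command name here is illustrative):

```xml
<PropertyGroup>
  <OutputType>Exe</OutputType>
  <TargetFramework>netcoreapp2.1</TargetFramework>
  <!-- mark the project as a global tool -->
  <PackAsTool>true</PackAsTool>
  <!-- the command name used to invoke the tool after installation -->
  <ToolCommandName>license-activator</ToolCommandName>
</PropertyGroup>
```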
With that in place, you only have to run dotnet pack to generate a NuGet package for the global tool.
After that, you can install it with dotnet install tool -g Your_Tool_Name. If you have packages from sources other than NuGet.org, you can add them via a command line argument or a NuGet.config. In my prototype I am using nightlies of Avalonia, so I need to configure their feed as an additional source. Additionally I configured a local source where the generated NuGet package gets published. It is here: https://github.com/SabotageAndi/LicenseActivatorPrototype/blob/master/NuGet.config
I had one small problem: I got an error during installation saying that a downgrade of the System.Net.Primitives package would happen, and so the tool could not be installed. The fix was to reference the package directly in the project.
So what does it mean that we can use Avalonia in a global tool?
We are not limited to command-line-only global tools; we can also write rich UI applications. And we can deploy them with a simple command line call.
I would call this cool. Let’s see what we will make out of it.
Thanks to Martin Ullrich (https://dasmulli.blog/2018/01/23/exploring-global-net-core-tools/) and Nate McMaster (https://www.natemcmaster.com/blog/2018/02/02/dotnet-global-tool/) for their blog posts about global tools.
During the preparation for my latest meetup talk, I noticed that I wasn't able to switch to presenter mode in Visual Studio. The Quick Launch tasks were simply not there.
What now? In the past they were part of the Productivity Power Tools extension. This was installed, but still no Quick Launch tasks for me. 🙁
After some research together with the audience, we found the reason. The presenter mode Quick Launch tasks were moved to a separate extension, and that one was not installed. The Productivity Power Tools are now a meta extension that depends on a lot of other extensions.
So after installing the extension and restarting VS, I had presenter mode back. If you are searching for it sometime in the future, the extension is called Quick Launch Tasks.
PS: I am sure that the missing extension is due to my development setup: multiple dev machines (4 machines at 3 locations) and extension roaming. So no bad feelings towards the extension developers.
Last Monday (30 January) a colleague and I gave a talk about the basics of MSBuild. So that others can enjoy my explanations too, I wrote this post.
All examples can be found here:
Classic Hello World
<Project Sdk="Microsoft.NET.Sdk">
<Target Name="HelloWorld">
<Message Text="Hello World!" />
</Target>
</Project>
Command: msbuild %Filename%.csproj /t:HelloWorld
That's the classic "Hello World", as for every programming language. And yes, MSBuild looks like XML, but it's a programming language. A "Target" defines a "method", which can call other targets and tasks. Tasks are functions that are defined in assemblies.
With the /t parameter you specify which target is executed. If you don't specify it, a default target is executed; but in this case we want the "HelloWorld" target to be executed and print "Hello World!".
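If you want a target to run without the /t parameter, you can declare it as a default target on the Project element:

```xml
<Project Sdk="Microsoft.NET.Sdk" DefaultTargets="HelloWorld">
  <!-- runs when msbuild is invoked without /t -->
  <Target Name="HelloWorld">
    <Message Text="Hello World!" />
  </Target>
</Project>
```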
Variables
Besides functions/methods, there are also variables in MSBuild. You define them in PropertyGroups.
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<HelloText>World</HelloText>
</PropertyGroup>
<Target Name="HelloWorld">
<Message Text="Hello $(HelloText)!" />
</Target>
</Project>
You can access them with $(%VariableName%).
Parameters
To get some data from outside into your MSBuild script, you can pass parameters with /p and access them like other variables.
<Project Sdk="Microsoft.NET.Sdk">
<Target Name="HelloWorld">
<Message Text="Hello $(HelloText)!" />
</Target>
</Project>
Command: msbuild HelloParameter.csproj /t:HelloWorld /p:HelloText=Internet
Ifs aka Conditions
And as in other languages, Ifs are available too. Here they are called Conditions.
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<HelloText Condition="$(HelloText)==''">World</HelloText>
</PropertyGroup>
<Target Name="HelloWorld">
<Message Text="Hello $(HelloText)!" />
</Target>
</Project>
You can add the Condition attribute to nearly everything: variables, whole property groups, targets or tasks.
This example uses a condition to set the variable "HelloText" to a default value if it wasn't defined via a parameter.
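The same attribute works on targets and tasks, for example to run a target only in a specific configuration:

```xml
<!-- this target is skipped entirely unless Configuration is Debug -->
<Target Name="HelloWorld" Condition="'$(Configuration)'=='Debug'">
  <Message Text="Hello Debug World!" />
</Target>
```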
Multiple Targets
Who wants to write everything in a single method? Nobody. But nothing stops you from writing more targets.
<Project Sdk="Microsoft.NET.Sdk">
<Target Name="HelloWorld">
<CallTarget Targets="PrintHello"/>
<CallTarget Targets="PrintWorld"/>
<CallTarget Targets="PrintExclamationMark"/>
</Target>
<Target Name="PrintHello">
<Message Text="Hello" />
</Target>
<Target Name="PrintWorld">
<Message Text="World" />
</Target>
<Target Name="PrintExclamationMark">
<Message Text="!" />
</Target>
</Project>
With the “CallTarget” task you can execute other targets.
“Events”
Sometimes you cannot add a "CallTarget" to get your own code executed in MSBuild. For that there is a feature that lets you trigger your targets before or after another target.
<Project Sdk="Microsoft.NET.Sdk">
<Target Name="HelloWorld">
<Message Text="Hello World!" />
</Target>
<Target Name="Before" BeforeTargets="HelloWorld">
<Message Text="BeforeHelloWorld" />
</Target>
<Target Name="After" AfterTargets="HelloWorld">
<Message Text="AfterHelloWorld" />
</Target>
</Project>
By setting the BeforeTargets/AfterTargets attributes you register your target to be executed before or after another target. A really nice feature to hook your targets into the existing lifecycle of compiling a project.
Files
Sometimes you need to execute a target for a list of files. This is possible with ItemGroups.
<Project Sdk="Microsoft.NET.Sdk">
<ItemGroup>
<Files Include="*.*" />
</ItemGroup>
<Target Name="HelloWorld"
Inputs="@(Files)"
Outputs="%(Identity).Dummy">
<Message Text="@(Files)" />
</Target>
</Project>
With ItemGroups you can define a variable that holds a list of files. You can add to and remove from this list with additional Include and Exclude entries in the ItemGroup.
To get a target executed once per entry in a file variable, you have to use the Inputs and Outputs attributes. In contrast to normal variables, you access item lists with @(…) instead of $(…).
You have to set the Outputs attribute whenever you set the Inputs attribute. In this case we can set it to %(Identity).Dummy, because we don't generate output files.
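For example, to include all files except the project file itself in the list:

```xml
<ItemGroup>
  <!-- Exclude removes matches from the included set -->
  <Files Include="*.*" Exclude="*.csproj" />
</ItemGroup>
```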
The end of my part
So that was my part of the talk. My colleague Raoul Holzer continued with some more advanced topics. His examples can be found here: https://github.com/RaoulHolzer/MSBuild201
And as soon as he has written his blog post, I will link it here.
A year ago I got the following eMail out of nowhere:
I was awarded as a Microsoft MVP in the “Visual Studio and Development Technologies” Category. How cool is that? And how did this happen?
I had filled out a profile and entered my activities on https://mvp.microsoft.com/, but the feedback I got months before the eMail was that a little bit was missing and it would need some more activities to become an MVP. But as it looks, it was still enough. 😉
So what did I do in the last year as an MVP?
First, I continued my work on SpecFlow (http://www.specflow.org; https://www.github.com/techtalk/SpecFlow), not only in my day job, but also in my spare time.
Motivated by the award, I wanted to give more talks and bring the .NET community in Vienna closer together. With my colleague Raoul Holzer, I did the following talks and events in the last 12 months:
And we have already planned some for the future:
Since the middle of December I am also co-organizer of the .NET Community Austria meetup group and the F# |> Vienna meetup group, and I help Jörg and Andi with organising the meetups.
I have to thank my company TechTalk for supporting the meetups with a location, drinks and food.
Ok, that was the "active" stuff that happened in the last year. But what else happened?
You get some really nice benefits when you are an MVP, and I really enjoy them. Some of them are from Microsoft, some are from other companies.
You get an Office 365 subscription and a Visual Studio subscription. From JetBrains you get a ReSharper and Rider license. That makes developing really easy.
At the end of May/beginning of June there was the MVP Community Connect in Madrid, which was really fun. There I had the chance to meet other MVPs from Austria, Spain, Portugal and Italy.
But the coolest benefit of being an MVP is the MVP Global Summit in Redmond, where MVPs from all over the world get together for some days and are able to get in touch with the product groups at Microsoft. As this will be my first Summit, I am really excited to attend.
So if you are doing community work in the Microsoft technology stack or working on Open Source projects that have anything to do with Microsoft, go and nominate yourself for the MVP award.
Hope to see you then at the next MVP Global Summit!
In SpecFlow+ we support multiple versions of SpecFlow (currently 1.9, 2.1 and 2.2), and so we have to test each version with the full set of our test suite.
As we don't want to copy a lot of code and then change some references/package versions, I looked into whether it is possible, without much pain, to have multiple projects/csprojs in one folder and so produce multiple assemblies.
We already have a solution with the old project system, but it involves a lot of linked files and manual editing of the project files. So with the new project system, it was time to look at this again.
And after some work I had a solution to achieve this with the new csproj format. Here it is:
<Project>
<PropertyGroup>
<BaseIntermediateOutputPath>obj\$(MSBuildProjectName)</BaseIntermediateOutputPath>
</PropertyGroup>
<Import Project="Sdk.props" Sdk="Microsoft.NET.Sdk"/>
<PropertyGroup>
<TargetFramework>netstandard2.0</TargetFramework>
<OutputPath>bin\$(Configuration)\$(TargetFramework)\$(MSBuildProjectName)</OutputPath>
</PropertyGroup>
<Import Project="Sdk.targets" Sdk="Microsoft.NET.Sdk"/>
<ItemGroup>
<Compile Remove="obj\\**\*"/>
<Compile Include="obj\$(MSBuildProjectName)\**\$(MSBuildProjectName).AssemblyInfo.cs"/>
</ItemGroup>
</Project>
So what does it do in detail?
Yes, there is no Sdk attribute defined. How does this work?
<PropertyGroup>
<BaseIntermediateOutputPath>obj\$(MSBuildProjectName)</BaseIntermediateOutputPath>
</PropertyGroup>
We are adjusting the BaseIntermediateOutputPath so that each project has its own and they don't overwrite each other's files. This is also the reason why there is no Sdk attribute: this property has to be set before the Sdk.props file is evaluated.
<Import Project="Sdk.props" Sdk="Microsoft.NET.Sdk"/>
So, finally import the first part of the Sdk.
<PropertyGroup>
<TargetFramework>netstandard2.0</TargetFramework>
<OutputPath>bin\$(Configuration)\$(TargetFramework)\$(MSBuildProjectName)</OutputPath>
</PropertyGroup>
The TargetFramework property is well known. Nothing to add about this here.
The OutputPath property is the location the finished assemblies are copied to. Like BaseIntermediateOutputPath, it has to be adjusted so that each project gets its own folder.
<Import Project="Sdk.targets" Sdk="Microsoft.NET.Sdk"/>
And now the second and last part of the Sdk is imported.
<ItemGroup>
<Compile Remove="obj\\**\*"/>
<Compile Include="obj\$(MSBuildProjectName)\**\$(MSBuildProjectName).AssemblyInfo.cs"/>
</ItemGroup>
This is one of the strangest parts of the file. The reason for it is that during the build, the version infos from the csproj are written to an *.AssemblyInfo.cs file in the obj folder. Combined with the globbing in the new format, MSBuild now finds multiple *.AssemblyInfo.cs files (one for each project).
So we have to remove the whole obj subtree from compilation and re-add the AssemblyInfo.cs of our own project, to get the version infos into the assembly.
Everything clear now? 😉
And best of all, Visual Studio 2017 can handle these files without problems.
You can find a complete example with two projects here: https://github.com/SabotageAndi/MultipleProjecsInSameFolder
Conclusion: I have to say, I am very happy with the new project system. Something like this wouldn't have been possible with the old one (or it would have been a lot more painful).
Let’s see what nice hacks are also possible in the future.
While investigating a build error of the Gherkin parser on Mac OS X, I wanted to uninstall all my installed .NET Core SDKs to get a clean slate.
Searching for how to do this, I found this script in the .NET Core repo: https://github.com/dotnet/cli/blob/master/scripts/obtain/uninstall/dotnet-uninstall-pkgs.sh
Steps to use:
wget https://raw.githubusercontent.com/dotnet/cli/master/scripts/obtain/uninstall/dotnet-uninstall-pkgs.sh
chmod +x dotnet-uninstall-pkgs.sh
sudo ./dotnet-uninstall-pkgs.sh
After that, all .NET Core SDKs are removed from your Mac and you can start installing new ones from scratch. 😉
Two weeks ago, in a project, we got a XamlParseException when a specific page was opened. The strange thing was that we hadn't changed any of the XAML code since the last working version. There were only some small changes in the code-behind of the page.
The error message was the following:
Windows.UI.Xaml.Markup.XamlParseException: ‘The text associated with this error code could not be found. Failed to create a ‘UWPSystemTypeConverterTest.Converter.EnumTypeConverter’ from the text ‘enums:CustomEnum’. [Line: 14 Position: 56]’
Here is the code that produces this error.
Complete project can be found on https://github.com/SabotageAndi/UWPSystemTypeConverterTest
shortened XAML Page:
<Page
x:Class="UWPSystemTypeConverterTest.MainPage"
xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
xmlns:converter="using:UWPSystemTypeConverterTest.Converter"
xmlns:enums="using:UWPSystemTypeConverterTest.Enum"
mc:Ignorable="d">
<Page.Resources>
<converter:EnumTypeConverter x:Key="Converter" TypeToDisplay="enums:CustomEnum" />
</Page.Resources>
<Grid Background="{ThemeResource ApplicationPageBackgroundThemeBrush}">
<TextBlock Text="{Binding Converter={StaticResource Converter}}" />
</Grid>
</Page>
Converter:
internal class EnumTypeConverter : IValueConverter
{
public Type TypeToDisplay { get; set; }
public object Convert(object value, Type targetType, object parameter, string language)
{
return TypeToDisplay?.FullName;
}
public object ConvertBack(object value, Type targetType, object parameter, string language)
{
throw new NotImplementedException();
}
}
The interesting thing was: when we added a public property of the enum type to the code-behind of the XAML page, it worked.
Code-Behind:
public sealed partial class MainPage : Page
{
public CustomEnum WithThisPropertyTheAppWorks { get; set; }
public MainPage()
{
InitializeComponent();
this.DataContext = this;
}
}
Reason why it worked:
After a StackOverflow post and some mails on an MVP mailing list, I got an answer for the strange behaviour.
The XAML compiler and the runtime don't support System.Type-typed properties. So the needed metadata is not generated and the runtime cannot convert the string to the type.
But because of the public property in the code-behind, the compiler now generates the needed metadata. I am not that happy with this workaround, but it is better than other solutions (e.g. a string property with the full name of the type).
I hope that a future version of the UWP XAML compiler and runtime will address this, so it is no longer an issue.
But until then, you know how to fix the error if you encounter it.