SpecFlow 2.2 and SpecFlow+ 1.6 finally released

After 13 months, SpecFlow 2.2 and SpecFlow+ 1.6 (Runner and Excel) were finally released yesterday.
Here are the official release notes: http://specflow.org/category/release-notes.

Two changes make me very happy.

The first is the removal of the MSBuild dependency. This was always a pain, as MSBuild sometimes used a cached version of your current project that was out of date. So if you added files to your project, the code-behind file generator might not know about them. This was a frequent source of problems, especially with SpecFlow+ Excel.

The second is that you can now get the FeatureContext as a parameter of your BeforeFeature/AfterFeature hooks. With that, you no longer need FeatureContext.Current (which has limitations in parallel execution scenarios).
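As a rough sketch of what this looks like in a binding class (F# here to match the rest of this blog; the FeatureHooks class and the "startedAt" key are made up for illustration):

open TechTalk.SpecFlow

[<Binding>]
type FeatureHooks() =
    // BeforeFeature hooks must be static; since SpecFlow 2.2 they can take
    // the FeatureContext as a parameter instead of using FeatureContext.Current
    [<BeforeFeature>]
    static member BeforeFeature (featureContext : FeatureContext) =
        featureContext.["startedAt"] <- box System.DateTime.UtcNow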

I am also very happy that support for the xUnit ITestOutputHelper finally made it in (https://github.com/techtalk/SpecFlow/pull/874). Thanks to Pushkar Apte for it!

So, let's start work on the next release. Will it be 2.3 or 3.0? Let's see what happens. 😉

Using SpecFlow+Runner's Parallelization Features

To start a parallel test run, you simply need to set the testThreadCount property in your .srprofile to a number higher than 1. How your tests are executed then depends on the testThreadIsolation property.
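For illustration, a minimal profile could look like this (a sketch only; the exact XML namespace and attribute set depend on your SpecFlow+ Runner version):

<?xml version="1.0" encoding="utf-8"?>
<!-- sketch: four test threads, each isolated in its own AppDomain -->
<TestProfile xmlns="http://www.specflow.org/plus/runner/TestProfile/1.5">
  <Execution testThreadCount="4" testThreadIsolation="AppDomain" />
</TestProfile>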

The upcoming release of SpecFlow+Runner 1.4 adds an additional mode (SharedAppDomain), so I thought it was time to explain this new option and the two existing ones (the new mode is already available in the pre-release version of 1.4 on GitHub).

The three supported modes are now:

  • AppDomain
  • Process
  • SharedAppDomain (new)

AppDomain

This is the default mode. Each test thread is executed in a separate AppDomain. These AppDomains are created at the beginning of the test run, and are reused for the rest of the test run.

Pros

Executed tests are isolated by the AppDomain border, so you do not have problems with static data.

Cons

Limited when you have shared data at the process level (e.g. SQLite in-memory databases)

Process

This mode has been supported since at least version 1.2 (I don't know the exact version). A separate executor process is created for each test thread and is used to execute the tests. This is necessary if your application contains entities that exist once per process, e.g. SQLite's in-memory database. These processes are started at the beginning of the test run and are reused for the rest of the test run.

This mode is also used if you run your tests using the .NET 2.0 framework or for a different processor architecture.

To keep your test run short, I would recommend setting testThreadCount to (CPU cores – 1). The remaining core is then kept free for the test runner process that manages the executor processes.

Pros

Completely process-based separation of executed tests

Cons

Slower due to the additional cost of starting the test execution processes and inter-process communication

SharedAppDomain

This new mode takes advantage of the new parallelization support in SpecFlow 2.0 and executes all tests in the same AppDomain. This makes it very fast, but the trade-off is that you lose the isolation between the currently executed tests. However, if you have tests that do not require isolation, this is the fastest way to execute them.

When using this mode, you can set testThreadCount to really high numbers and still have fast test runs.

Pros

Very fast

Cons

No isolation between currently executed tests

Impressions of using F# (with a little bit of Xamarin.Forms)

In the last few days I completed the first version of FFRAB-Mobile (see here). I used F# and Xamarin.Forms to gain more experience in a bigger project than a simple example app. You can find the sources here: https://github.com/SabotageAndi/ffrab-mobile

Here are my impressions:

Writing stateless code can be hard, but it is worth the trouble

Coming from a C# background, writing code that is almost completely stateless is new and unusual. The first code I wrote was a lot of object-oriented stuff with too much access to shared state. Especially the access to the database connection was a little painful at the beginning.

But after some refactoring it got more and more in a functional style. At least I hope so.

Before refactoring:

https://github.com/SabotageAndi/ffrab-mobile/blob/0e72e26e83f5112e6c95cc99959baa9a55eb9882/ffrab.common/model.fs (lines 52–140)

After refactoring:

https://github.com/SabotageAndi/ffrab-mobile/blob/master/ffrab.common/database.fs
https://github.com/SabotageAndi/ffrab-mobile/blob/master/ffrab.common/queries.fs

In the end it was worth the trouble: the code is simpler and more readable.

If it looks ugly, make a function

When you have pure F# code and libraries, it is completely in your hands to write nice code. But when you are using C# libraries, it can quickly get ugly because of the library: the parameters of C# methods are represented as tuples, so you end up with a lot of brackets and cannot use the forward pipe operator '|>'.

But simply wrap the C# method call in a small function, and voilà.

Here is an example using NodaTime (the "let startTime" line is the important one). Direct call to the C# API:

let dateTimeFormat = OffsetDateTimePattern.CreateWithInvariantCulture("yyyy'-'MM'-'dd'T'HH':'mm':'sso<G>")
let startTime = common.Formatting.dateTimeFormat.Parse(dayNode.["day_start"].Value<string>()).Value

With F# wrapper function:

let parseNodaTime<'T> (pattern : NodaTime.Text.IPattern<'T>) rawValue =
        let result = pattern.Parse(rawValue)
        result.Value

let dateTimeFormat = OffsetDateTimePattern.CreateWithInvariantCulture("yyyy'-'MM'-'dd'T'HH':'mm':'sso<G>")
let startTime = (json.GetProperty "day_start").AsString() |> parseNodaTime common.Formatting.dateTimeFormat

For a single usage it might not be that bad, but with multiple calls it is much more readable. In particular, you get the nice left-to-right readability back!

Left-to-right readability combined with railway oriented programming (ROP) rocks!

Look at this example:

let synchronizeData conference =
    match conference with
    | Some conference ->
        conference
        |> checkForTimeout
        |> getDataLocation
        |> fetchJson
        |> Parser.parseJson conference
        |> Synchronization.sync conference
    | _ ->
        ignore()
    conference

All parts of the synchronization follow one after another, easy to read and to extend. Adding the additional timeout check to this pipeline was easy.

MVVM is more fun with FSharp.ViewModule than with C#

With FSharp.ViewModule and F# you have much less boilerplate code in your viewmodels. Without the curly brackets of C#, the line count shrinks even further.

The 8 viewmodels are about 260 lines (including empty lines and boilerplate functions/types). In C#, 3 viewmodels alone would reach this code size.
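For a feel of the style, here is a minimal viewmodel sketch following the ViewModelBase/Factory.Backing pattern from FSharp.ViewModule (the EntryViewModel type and its Title property are invented for illustration):

open FSharp.ViewModule

type EntryViewModel() as self =
    inherit ViewModelBase()

    // backing field; change notification is generated by the library
    let title = self.Factory.Backing(<@ self.Title @>, "")

    member __.Title
        with get () = title.Value
        and set value = title.Value <- value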

Tooling? Enough, but some rough edges

When you code in Visual Studio, you need the Visual F# Power Tools. The first thing you should do is enable folder organization. You do not need folders for your F# code, but for organizing other files like images or JSON files. No idea why it is disabled by default.

On the Xamarin side there is sadly one bug that prevents using Xamarin.Forms on Android with F#: it looks like the F# CodeDom provider does not escape F# keywords. Details are in the Bugzilla entry: https://bugzilla.xamarin.com/show_bug.cgi?id=24709. Hopefully the bug is fixed soon.

For the C# developers with ReSharper: at first you think you will miss it, but you do not need it. The Power Tools have a rename function, and I did not miss other refactoring features like extracting code or adding namespaces.

For me, F# is a really fun and productive language. Not everything is easy and clear when you use it for the first time, but the fog lifts with time. When you work with F# in a concentrated way for a few days, you will learn and understand a lot. And it will be awesome. 😉

Tips for starting to develop F# in Atom

This blog post is outdated!

Use VS Code and Ionide.

After some days of working with Atom to write F# code, here are some tips:

Read the manual and use the ionide-installer package 😉

On my first try I installed some of the ionide packages individually, and so some features were missing. So really: read the manual and use the ionide-installer package.

On Mac OS X – use the Mono package and not the one from Homebrew

When you are using the Homebrew version, Ionide cannot spawn the background service for autocompletion. If you use the Mono package from http://www.mono-project.com/download/, everything works. Additionally, this package includes the reference assemblies for Portable Class Libraries; without them you cannot compile a PCL.

Additional packages

  • git-plus: https://atom.io/packages/git-plus – git commands available in the command palette
  • terminal-plus: https://atom.io/packages/terminal-plus – a terminal within Atom; if you use a German keyboard, you have to adjust the key mapping for toggling it
  • minimap: https://atom.io/packages/minimap – file preview on the right side

Using Yeoman and generator-fsharp

Yeoman with the F# generator (https://github.com/fsprojects/generator-fsharp) is a nice tool for creating new projects from templates. I added some new features in the last few days:

  • add FAKE support (https://github.com/fsprojects/generator-fsharp/pull/26)
  • add references to other projects (https://github.com/fsprojects/generator-fsharp/pull/26)
  • add a template for PCLs with Profile 259 (https://github.com/fsprojects/generator-fsharp/pull/25)

Hopefully they will be merged in the next few days.

Using NodaTime in SQLite.Net

For a project I wanted to try out NodaTime because of its nice time zone handling. In this project I also have to save values into a SQLite database, and for that I normally use SQLite.net. Out of the box, SQLite.net does not know how to handle the NodaTime types, but there are two places to add support for your own types.

ExtraTypeMapping

You can specify which SQLite type should be used for each of your types: INTEGER, REAL, TEXT or BLOB. It is simply a Dictionary<System.Type, String> that you provide as a constructor parameter; for the NodaTime types, you map them to BLOB.

IBlobSerializer

When you save your values as BLOBs, you can use the BlobSerializerDelegate of SQLite.net and provide the appropriate delegates to serialize and deserialize your types.

When you combine both, you can simply save your own types as BLOBs in your database.
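As a rough sketch of the idea, in the spirit of the gist linked below (the constructor overload and the delegate signatures vary between SQLite.Net forks/versions, so treat the exact names here as assumptions):

open System
open System.Text
open NodaTime
open NodaTime.Text
open SQLite.Net

let localDatePattern = LocalDatePattern.CreateWithInvariantCulture("yyyy'-'MM'-'dd")

// serialize/deserialize LocalDate as an ISO-8601 string stored in a BLOB
let blobSerializer =
    BlobSerializerDelegate(
        (fun value -> Encoding.UTF8.GetBytes(localDatePattern.Format(value :?> LocalDate))),
        (fun data _ -> box (localDatePattern.Parse(Encoding.UTF8.GetString(data, 0, data.Length)).Value)),
        (fun t -> t = typeof<LocalDate>))

// tell SQLite.net to store LocalDate columns as BLOBs via the extra type mapping
let extraTypeMappings = dict [ typeof<LocalDate>, "BLOB" ]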

You can find the code for saving LocalDate, OffsetDateTime and Duration here: https://gist.github.com/SabotageAndi/194aa01e1c78cb614c1b.

It is in F#, but it should be understandable even for C#-only developers. 😉

On the Bleeding Edge – Akka.Net + Suave

After playing with Suave.io, I wanted to add some more bleeding-edge stuff to it 😉 and run it on a beta environment: the DNX. 🙂

First, the complete source of it is here: https://github.com/SabotageAndi/SuaveAkkaCore

Thanks to Alxandr, there is F# support for the DNX: https://github.com/YoloDev/YoloDev.Dnx.FSharp

So here is a quick walkthrough of the code to get everything working:

1. Add the needed NuGet feeds

  • YoloDev: https://www.myget.org/F/yolodev/api/v2
  • AspNetVNext: https://www.myget.org/F/aspnetvnext/api/v2

2. Adapt project.json for F# Support

{
    "version": "1.0.0-beta-*",
    "dependencies": {
        "YoloDev.Dnx.FSharp": { "type": "build", "version": "1.0.0-beta-*" },
        "Suave": "0.29.1",
        "Akka": "1.0.3",
        "Akka.FSharp": "1.0.3"
    },
    "frameworks": {
        "dnx451": {
            "frameworkAssemblies": {
                "System.Runtime": "",
                "System.Threading.Tasks": ""
            }
        }
    },
    "compiler": {
        "name": "F#",
        "compilerAssembly": "YoloDev.Dnx.FSharp",
        "compilerType": "YoloDev.Dnx.FSharp.FSharpProjectCompiler"
    }
}

The YoloDev.Dnx.FSharp dependency and the compiler section are the changes needed for F# support; the Suave, Akka and Akka.FSharp dependencies are the changes needed for Akka.net and Suave.io.

3. Initialize Akka.net & Suave.io

member x.Main () =
    use akkaSystem = System.create "SuaveAkkaCore" (Configuration.defaultConfig()) //1

    spawn akkaSystem "root" (actorOf2 handleRequest) |> ignore //2

    let cts = new CancellationTokenSource()

    let startingServer, shutdownServer = startWebServerAsync defaultConfig (app akkaSystem) //3
    Async.Start(shutdownServer, cts.Token)

    startingServer |> Async.RunSynchronously |> printfn "started: %A"

    printfn "Press Enter to stop"
    Console.Read() |> ignore

    cts.Cancel()

At //1 the Akka system is created with a standard configuration. It is named "SuaveAkkaCore"; this is important later for finding the actor again. The only actor in this example (named "root") is spawned at //2. When the actor gets a message, the handleRequest function is called. The last part of initializing the system is //3, the start of Suave. Only one WebPart is involved here.

4. The Suave WebPart

let app system : WebPart = 
     fun (httpContext : HttpContext) ->
         async {
             let response = sendRequestToActor system httpContext
             return! response
         }

It's a simple WebPart that reacts to every request and calls the sendRequestToActor function with the current Akka system and HttpContext (which holds the request/response data in Suave).

5. Sending the message to the actor

let sendRequestToActor system (httpContext : HttpContext) =

    let callActor = async {
        // select the root actor via its actor path
        let actor = select "akka://SuaveAkkaCore/user/root" system
        // ask the actor; it answers with a Suave WebPart
        let! (response : WebPart) = actor <? httpContext
        return response
    }

    let webPart = callActor |> Async.RunSynchronously
    webPart httpContext

In the first part, the root actor is selected; for this, the system name and the actor name are needed. With the <? operator, the actor is asked with the HttpContext as the message. After that, we run the async workflow and apply the returned WebPart to the HttpContext.

6. The actor function itself

let handleRequest (mailbox : Actor<'a>) (msg : HttpContext) =
   let url = msg.request.url.ToString()                                     
   mailbox.Sender() <! (OK url)

In the actor function, we get the mailbox and the message as parameters. Here I simply take the request URL and return it as an OK response.

To run it, you need the DNX runtime installed. Switch to the Mono runtime for DNX with "dnvm use 1.0.0-beta6-12120 -p -r mono". With "dnu restore" you get the needed packages. After that, "dnx . run" starts the program.

Now browse to http://localhost:8083 and you will see the requested URL as the response.

Have fun!

Playing With AngularJS and Suave

After I found Suave.IO (a lightweight HTTP server in F#), I wanted to try it out. I also wanted to play with AngularJS (I know I am a little late, but normally I stay out of the way of web development). So why not combine both and create a small web server that hosts the web app and the web services?

As the example AngularJS app, I used the PhoneCat tutorial app from AngularJS.

You find the complete source on GitHub here: AngularSuave Repository

Explanation step-by-step:

let mimeTypes =
    defaultMimeTypesMap
        >=> (function | ".json" -> mkMimeType "application/json" true | _ -> None)

The default mime types map in Suave has no entry for JSON files, so I simply add one.

let rootPath = Path.GetFullPath "../../../Web"

The source files for the AngularJS app are located in a separate project.

let webConfig = 
    { 
        defaultConfig with 
            homeFolder = Some rootPath
            mimeTypesMap = mimeTypes
    }

I use the default server config that is shipped with Suave and only adjust the needed parts.

let getPhone phoneName =
    printfn "getting data for phone: %s" phoneName

    let phoneFolder = Path.Combine(rootPath, "app", "phones")
    browseFile phoneFolder phoneName

A function to return the data for the phones; the data is stored as JSON files.

let api =
    choose
        [
            GET >>= choose 
                [
                    pathScan "/api/phones/%s" (fun s -> getPhone s)
                ]
        ]

In Suave you have WebParts that can react to different URLs. Here I react to all calls to /api/phones/ and call the function that gets the phone data.

let getFile name =
    let rootPath = webConfig.homeFolder.Value
    browseFile rootPath name

A function to return the content of the requested file.

let angularApp =
    choose
        [ GET >>= choose
            [
                path "/" >>=  redirect "app/index.html"
                pathScan "/%s" (fun s -> getFile s)                
            ]
        ]

The AngularJS app is located under /app/, so I redirect when you simply browse to the root page. For all other requests to /, I simply return the requested file.

let app =
    choose
        [
            api
            angularApp
        ]

The choose WebPart executes every WebPart in the list and stops as soon as one is successful. Here the order is important, because the /%s pattern in the angularApp WebPart will eat every request; if the order were reversed, no call would ever reach the api WebPart.

[<EntryPoint>]
let main argv = 
     
    let cts = new CancellationTokenSource()
    let startingServer, shutdownServer = startWebServerAsync webConfig app
 
    Async.Start(shutdownServer, cts.Token)
 
    startingServer |> Async.RunSynchronously |> printfn "started: %A"
 
    printfn "Press Enter to stop"
    Console.Read() |> ignore
 
    cts.Cancel()
 
    0

At the end, I simply start the web server with the custom web config and the app WebPart.

When you start the program and open http://localhost:8083/ you can see the PhoneCat tutorial app hosted by Suave.IO.

So have fun with it. 😉

F# Raw SQL Access

When I started working with F#, I soon needed to access a database. If you read my last post, you know that an ORM was out of the question. What really surprised me in the end was that the code is much shorter than any other DB access code I have written.

So how does it look?

The general idea is to first create a record that holds the SQL statement and its parameters, which then gets executed.

The record enables you to write tests for your database access.

For accessing the ADO.Net functions, I use FsSql.

So let's begin with the entity that represents one row of the table:

type Weather() =
    member val Id = 0 with get,set
 
    member val LocationId = Guid.Empty with get,set
    member val DataFrom = DateTimeOffset.MinValue with get,set
    member val DataTo = DateTimeOffset.MinValue with get,set
    member val Temperature = 0.0m with get,set
    member val Humidity = 0.0m with get,set
    member val Rain = 0.0m with get,set
    member val WindSpeed = 0.0m with get,set
    member val WindDirection = 0.0m with get,set
    member val Clouds = 0.0m with get,set
    member val Pressure = 0.0m with get,set

It’s a simple store for weather data for a location.

To insert an entry into the table, we need a function that builds the wrapper record:

let private insert (weather : Weather) =
    {
        query = "INSERT INTO \"intersect\".\"Weather\"(
                     locationid, datafrom, datato, temperature, humidity, rain,
                     windspeed, winddirection, clouds, pressure)
                 VALUES (@locationid, @datafrom, @datato, @temperature, @humidity, @rain,
                         @windspeed, @winddirection, @clouds, @pressure);
                 RETURNING id;";
        parameters = [
                         P("@locationid", weather.LocationId);
                         P("@datafrom", weather.DataFrom);
                         P("@datato", weather.DataTo);
                         P("@temperature", weather.Temperature);
                         P("@humidity", weather.Humidity);
                         P("@rain", weather.Rain);
                         P("@windspeed", weather.WindSpeed);
                         P("@winddirection", weather.WindDirection);
                         P("@clouds", weather.Clouds);
                         P("@pressure", weather.Pressure);
                     ]
    }

The wrapper record has the query as a string and a list of SQL parameters.

type changeQueryObject =
    {
        query : string;
        parameters : Sql.Parameter list;
    }

The P function is a shortcut for FsSql's Sql.Parameter.make.
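A minimal sketch of it (assuming Sql.Parameter.make takes a name/value pair):

// shortcut for creating an FsSql parameter
let P (name, value) = Sql.Parameter.make (name, box value)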

To execute the insert statement, we pass the changeQueryObject to the following function:

let executeScalar (queryObj : changeQueryObject) =
    sql.ExecScalar queryObj.query queryObj.parameters

With the "RETURNING id" at the end of the SQL statement, executing it with ExecScalar gives us the primary key of the new entry; exactly what we want after an insert.
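Usage then looks roughly like this (whether ExecScalar wraps the result in an option depends on the FsSql version):

// build the wrapper record and execute it; "RETURNING id" makes the
// scalar result the generated primary key
let newId = insert weather |> executeScalar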

The same works for updates:

let private update (weather : Weather) =
    {
        query = "UPDATE \"intersect\".\"Weather\"
                 SET locationid=@locationid, datafrom=@datafrom, datato=@datato, temperature=@temperature, humidity=@humidity,
                     rain=@rain, windspeed=@windspeed, winddirection=@winddirection, clouds=@clouds, pressure=@pressure
                 WHERE id=@id;
                 SELECT @id;
                 ";
        parameters = [
                         P("@id", weather.Id);
                         P("@locationid", weather.LocationId);
                         P("@datafrom", weather.DataFrom);
                         P("@datato", weather.DataTo);
                         P("@temperature", weather.Temperature);
                         P("@humidity", weather.Humidity);
                         P("@rain", weather.Rain);
                         P("@windspeed", weather.WindSpeed);
                         P("@winddirection", weather.WindDirection);
                         P("@clouds", weather.Clouds);
                         P("@pressure", weather.Pressure);
                     ]
    }

And with a little pattern matching, we get a nice save function:

let save (weather : Weather) =
    match weather.Id with
    | 0 -> insert weather
    | _ -> update weather

But what about querying and filtering? This is where F# plays to its strengths.

For that we need a slightly changed query wrapper:

type selectQueryObject<'T> =
    {
        query : string;
        parameters : Sql.Parameter list;
        deserialisation : (IDataRecord -> 'T) option;
    }

The new field "deserialisation" holds a function that creates an instance of the entity from a data record. This is the one for the weather entity:

let asWeather (r: IDataRecord) =
    new Weather(Id = (r?id).Value, LocationId = (r?LocationId).Value, DataFrom = (r?DataFrom).Value,
    DataTo = (r?DataTo).Value, Temperature = (r?Temperature).Value,
    Humidity = (r?Humidity).Value, Rain = (r?Rain).Value, WindSpeed = (r?WindSpeed).Value,
    WindDirection = (r?WindDirection).Value, Clouds = (r?Clouds).Value, Pressure = (r?Pressure).Value)

What does the querying function look like?

let search (fc : WeatherSearchRequest) =
    let query = "SELECT id, locationid, datafrom, datato, temperature, humidity, rain, windspeed, winddirection, clouds, pressure FROM \"intersect\".\"Weather\""
 
    let locationIdPara = w "locationid" "@locationid" "=" fc.LocationId
    let fomPara = w "datafrom" "@datafrom" ">=" fc.From
    let toPara = w "datato" "@datato" "<=" fc.To
    let datetimeFromPara = w "datafrom" "@datafrom" ">=" fc.DateTime
    let datetimeToPara = w "datato" "@datato" "<=" fc.DateTime
 
    Some asWeather
    |> emptyQueryPart
    |> combineAnd locationIdPara
    |> combineAnd fomPara
    |> combineAnd toPara
    |> combineAnd datetimeFromPara
    |> combineAnd datetimeToPara
    |> combineQueryParts query

The WeatherSearchRequest is a POCO which holds the various filter values.

Every possible part of the WHERE clause is wrapped in a queryPart record:

type queryPart =
    {
        where : string;
        parameter : Sql.Parameter option
    }

The w function (for "where") creates such a record:

let private w<'a when 'a : (new : unit -> 'a) and 'a : struct and 'a :> System.ValueType> column parametername operator (value : System.Nullable<'a>) =
    match value.HasValue with
    | true -> { where = sprintf "%s %s %s" column operator parametername; parameter = Some <| P(parametername, value.Value)}
    | false -> {where = ""; parameter = Option<Sql.Parameter>.None}

When the filter value is set, it creates a filled queryPart; otherwise we get an empty one.

But what do we do with it now? Let’s look at the next code block.

Some asWeather
|> emptyQueryPart
...

This creates a selectQueryObject with the deserialisation function set. The selectQueryObject is then piped through multiple combineAnd calls.
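The emptyQueryPart function is not shown above; inferred from how combineAnd matches on an empty query string, a sketch of it could look like this:

// lift the deserialisation function into an otherwise empty selectQueryObject
let private emptyQueryPart<'T> (deserialisation : (IDataRecord -> 'T) option) =
    { query = ""; parameters = []; deserialisation = deserialisation }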

...
|> combineAnd locationIdPara
|> combineAnd fomPara
|> combineAnd toPara
|> combineAnd datetimeFromPara
|> combineAnd datetimeToPara
...

let private combineAnd<'T> (x : queryPart) (qo : selectQueryObject<'T>) =
    match (x.parameter, qo.query) with
    | (Some xpart, "") -> { query = x.where; parameters = [x.parameter.Value]; deserialisation = qo.deserialisation}
    | (Some xpart, _) -> {query = qo.query + " AND " + x.where; parameters = x.parameter.Value :: qo.parameters; deserialisation = qo.deserialisation}
    | (None, _) -> { query = qo.query; parameters = qo.parameters; deserialisation = qo.deserialisation}

Once again, we use some pattern matching.

The first case is hit when we have a filled queryPart but a still-empty selectQueryObject, i.e. for the first filter criterion that is set.

The second case is hit when we add further filter criteria to the selectQueryObject.

The last one is hit when the queryPart is empty, i.e. when some filter criteria have already been added but the current one is not set.

At the end, we pipe the selectQueryObject to the combineQueryParts function.

...
|> combineQueryParts query
...
 
let private combineQueryParts<'T> (query :string) (qo :selectQueryObject<'T>) =
    let mutable q = query
    if (qo.query.Length > 0) then
        q <- query + " WHERE " + qo.query
 
    { query = q; parameters = qo.parameters; deserialisation = qo.deserialisation; }

This function simply combines the SELECT query with the WHERE clause if needed.

In the end, we get a nicely filled selectQueryObject, which is then passed to a general execution function:

let executeReader<'T> (queryObj : selectQueryObject<'T> )=
    sql.ExecReader queryObj.query queryObj.parameters
    |> Seq.ofDataReader
    |> Seq.map queryObj.deserialisation.Value
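Putting it all together, running a filtered query looks roughly like this (a usage sketch; the module layout is assumed):

// build a filter, create the query record and execute it
let fc = new WeatherSearchRequest()
fc.LocationId <- Nullable(Guid.NewGuid())

let weatherEntries =
    search fc
    |> executeReader
    |> List.ofSeq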

Nice, isn't it?

The complete code can be found here.

So what about testing? With the query wrapped in a record, we can simply test the data in the record.

let checkSqlParameter (name : string) (value : obj) (parameter : Sql.Parameter) =
    obj.Equals(parameter.ParameterName,name) && obj.Equals(parameter.Value,value)
 
let checkForSqlParameter (name : string) (value : obj) parameters =
    parameters |> Seq.exists (fun i -> checkSqlParameter name value i) |> should equal true
 
let nullableFromPara = new Nullable<DateTimeOffset>(fromPara)
let nullableToPara = new Nullable<DateTimeOffset>(toPara)
let locationIdPara = new Nullable<Guid>(Guid.NewGuid())
 
[<Fact>]
let ``Check query with LocationId, From and To parameter``() =
    let fc = new WeatherSearchRequest()
    fc.LocationId <- locationIdPara
    fc.From <- nullableFromPara
    fc.To <- nullableToPara
 
    let queryObj = weather.search fc
    queryObj.parameters |> Seq.length |> should equal 3
    queryObj.parameters |> checkForSqlParameter "@datafrom" fc.From
    queryObj.parameters |> checkForSqlParameter "@datato" fc.To
    queryObj.parameters |> checkForSqlParameter "@locationid" fc.LocationId
 
    queryObj.query |> should equal "SELECT id, locationid, datafrom, datato, temperature, humidity, rain, windspeed, winddirection, clouds, pressure FROM \"intersect\".\"Weather\" WHERE locationid = @locationid AND datafrom >= @datafrom AND datato <= @datato"

When was the last time you could test exactly which SQL query gets executed against your DBMS? 😉

No More ORM

After some years of working with different OR mappers (NHibernate, Entity Framework, proprietary ones, …) to build a lot of line-of-business applications, I came to the conclusion to trash them.

So how did I come to this conclusion? Yes, at first glance ORMs make it easy to access and query your database. But that is all you get, and you also get some problems that were not there before.

Here are some of them:

Generated Queries and Performance

The ORM generates the SQL query for you, but you will never get the best SQL query for your data. You do not want to fetch all columns of a table? Have fun!

Transactions

Every ORM has its own idea of how to represent a transaction. Why? Plain DB transactions are really easy to understand; why add another complication to your application? When you want a stable and predictable LoB application, you want to know exactly when which changes are committed to the database.
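To illustrate how little ceremony a plain DB transaction needs (a sketch using Npgsql, since the earlier posts target PostgreSQL; the connection string is a placeholder):

open Npgsql

let updateWithinTransaction (connectionString : string) =
    use connection = new NpgsqlConnection(connectionString)
    connection.Open()
    // you decide exactly when the changes are committed
    use transaction = connection.BeginTransaction()
    use command = connection.CreateCommand()
    command.Transaction <- transaction
    command.CommandText <- "UPDATE \"intersect\".\"Weather\" SET rain = 0 WHERE rain IS NULL"
    command.ExecuteNonQuery() |> ignore
    transaction.Commit()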

Predictability

Do you know when which query is executed against your database? How often have you looked for bugs where data was not returned by a query, just because you forgot to flush a session somewhere?

Additional Query language

Most ORMs have their own query language or use LINQ for queries. LINQ is very nice for querying your objects, but when you create more complicated queries, it gets ugly really quickly. GROUP BY queries? Outer joins? Hurray! Why add another abstraction when you already have a Structured Query Language? How often have you fixed the code of inexperienced developers with performance issues because the queries ran in memory or resulted in hundreds of small queries against the DB? Simply because the code looks the same and they did not think about it.

Support for different DBMS, but none to 100%

When did you last try to access a DBMS-specific feature? You want to use PostGIS? Filestreams on MS SQL? Or, much simpler, timestamptz on PostgreSQL? No chance. Why is this? Simply because the ORM has to provide an API across all supported DBMS. And when did you last write a LoB application that had to run on different DBMS? I haven't.

Additional problems that you did not see coming

This week I was talking with a colleague about some problems with the ORM we use and the build queue. WTF? What does the ORM have to do with the build queue?

This post is not meant to be a rant about ORMs or to dictate that no one should use them anymore. I have noticed in the last weeks that people no longer think about the question of whether they want to use an ORM. They simply say, "Yeah, let's use X as our ORM", without thinking about whether they can achieve their requirements with it. So see this post as a starting point to rethink your technology choices.

ORMs are a part of your toolchain; you do not have to fix every problem with your beloved tool. Use the right tool for the job.