Apache Spark is written in Scala, which compiles to JVM bytecode and runs inside a Java virtual machine. The spark-dotnet driver runs dotnet code that calls Spark functionality, so how does that work?
There are two paths to running dotnet code with Spark. The first is the general case, which I will describe here; the second is UDFs, which I will explain in a later post as it is slightly more involved.
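To make the general case concrete, here is a rough sketch of what a spark-dotnet driver program looks like, assuming the Microsoft.Spark NuGet package and a hypothetical input file name. The point is that this is ordinary dotnet code, yet every call here ultimately has to reach Spark running on the JVM:

```csharp
using Microsoft.Spark.Sql;

class Program
{
    static void Main(string[] args)
    {
        // Build (or reuse) a SparkSession - this call is forwarded to the JVM side
        SparkSession spark = SparkSession
            .Builder()
            .AppName("spark-dotnet-example")
            .GetOrCreate();

        // "people.json" is just a placeholder path for illustration
        DataFrame people = spark.Read().Json("people.json");

        // Each DataFrame operation is also relayed to Spark on the JVM
        people.Filter("age > 21").Show();
    }
}
```

None of the actual query work happens in the dotnet process; the rest of this post looks at how calls like these get across to the JVM.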