All Spark Connect Posts
Goal of this post

This post shows how to create a .NET application, deploy it to Databricks, and then run a Databricks job that calls our .NET code. That code uses Spark Connect to run a Spark job on the Databricks job cluster and write some data out to Azure storage.
In the previous post, I showed how to use the Range command to create a Spark DataFrame and then save it locally as a parquet file.
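To recap, the core of that example looked something like the sketch below. This is a minimal illustration, assuming the .NET Spark Connect library mirrors PySpark-style naming (SparkSession.Builder, Range, Write); the namespace, connection string, and output path are placeholders rather than the exact values from that post.

```csharp
using Spark.Connect.Dotnet.Sql;   // assumed namespace; adjust to the library you reference

// Connect to a Spark Connect endpoint, build a DataFrame with Range,
// and write it out locally as a Parquet file. Method names follow the
// PySpark-style API the library mirrors; treat them as illustrative.
var spark = SparkSession
    .Builder
    .Remote("sc://localhost:15002")     // placeholder Spark Connect connection string
    .GetOrCreate();

var dataFrame = spark.Range(100);       // 100 rows with a single `id` column

dataFrame
    .Write()
    .Mode("overwrite")
    .Parquet("/tmp/range.parquet");     // placeholder local output path
```

In this post, the same pattern runs inside Databricks, with the output going to Azure storage instead of a local path.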