Steps needed

Getting Apache Spark running on Windows involves:
- Installing a JRE 8 (Java 1.8/OpenJDK 8)
- Downloading and extracting Spark and setting SPARK_HOME
- Downloading winutils.exe and setting HADOOP_HOME
- If you are using the dotnet driver, also downloading the Microsoft.Spark.Worker and setting DOTNET_WORKER_DIR if you are going to use UDFs
- Making sure java and %SPARK_HOME%\bin are on your PATH

There are some pretty common mistakes people make (myself included!). The most common I have seen recently are having a semi-colon in JAVA_HOME, SPARK_HOME, or HADOOP_HOME, or having HADOOP_HOME point to a directory that does not have a bin folder containing winutils.exe (see the sketch below for a quick way to check).
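If you want to double-check your setup before running Spark, a small script like the minimal sketch below will flag the usual suspects. It only assumes a standard Python install; the checks themselves (stray semi-colons, missing directories, a HADOOP_HOME without bin\winutils.exe, and %SPARK_HOME%\bin missing from PATH) mirror the mistakes listed above.

```python
import os
from pathlib import Path

def check_home(name, expect_winutils=False):
    """Print basic sanity checks for one of the *_HOME environment variables."""
    value = os.environ.get(name)
    if not value:
        print(f"{name} is not set")
        return
    if ";" in value:
        # A trailing or embedded semi-colon breaks path resolution on Windows
        print(f"{name} contains a semi-colon: {value!r}")
    path = Path(value)
    if not path.is_dir():
        print(f"{name} does not point to an existing directory: {value}")
    if expect_winutils and not (path / "bin" / "winutils.exe").is_file():
        print(f"{name} has no bin\\winutils.exe under it: {value}")

for var in ("JAVA_HOME", "SPARK_HOME"):
    check_home(var)
check_home("HADOOP_HOME", expect_winutils=True)

# PATH should include %SPARK_HOME%\bin so spark-submit and friends resolve
spark_bin = os.path.join(os.environ.get("SPARK_HOME", ""), "bin")
path_entries = os.environ.get("PATH", "").split(os.pathsep)
if spark_bin and spark_bin not in path_entries:
    print(f"PATH does not contain {spark_bin}")
```

Run it from the same shell you will launch Spark from, since environment variable changes only apply to new sessions.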