Installing Spark on Windows threw two errors before it would start cleanly.
1. Missing binary file WINUTILS.EXE
Solution: Download the latest WINUTILS.EXE and place it at %HADOOP_HOME%\bin\winutils.exe
Note: Be careful when downloading the LATEST winutils.exe.
Prechecks before downloading:
1. Check whether your Windows installation is 32-bit or 64-bit.
2. Check whether the winutils.exe download is built for 32-bit or 64-bit.
A 32-bit winutils.exe works only on 32-bit Windows, and a 64-bit build works only on 64-bit Windows (a quick check is shown below).
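As a quick sanity check (a sketch, assuming winutils.exe has already been copied to %HADOOP_HOME%\bin), the following can be run from a command prompt:
rem Shows whether this Windows installation is 32-bit (x86) or 64-bit (AMD64)
echo %PROCESSOR_ARCHITECTURE%
rem Running winutils.exe with no arguments should print its usage text;
rem an "is not a valid Win32 application" error usually means the wrong bitness was downloaded
%HADOOP_HOME%\bin\winutils.exe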
2. java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
After downloading winutils.exe and placing it in the bin folder,
create the \tmp\hive folder on the C:\ drive from a command line running in admin mode,
and then run the commands below.
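If the folder does not exist yet, it can be created first from the same admin prompt (assuming Spark runs from the C: drive):
rem Creates the Hive scratch directory on C:
mkdir C:\tmp\hive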
%HADOOP_HOME%\bin\winutils.exe chmod -R 777 \tmp\hive
%HADOOP_HOME%\bin\winutils.exe ls \tmp\hive
In the end, C:\tmp\hive should exist with full access for every user.
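One quick way to confirm the fix (a sketch; the path assumes the layout from my setup shown below) is to start a Spark shell and check that it reaches the prompt without the scratch-dir exception:
rem With \tmp\hive writable, spark-shell should start without the
rem "root scratch dir ... should be writable" RuntimeException
cd C:\Spark\bin
spark-shell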
In my case:
SPARK_HOME = C:\Spark\bin
HADOOP_HOME=C:\Spark
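For reference, a minimal sketch of setting the same variables from an admin command prompt (the paths are just the ones from my setup; adjust them to your own install location):
rem Persists the variables for future command prompts
setx HADOOP_HOME "C:\Spark"
setx SPARK_HOME "C:\Spark\bin"
rem setx does not affect the current window; use set for the current session
set HADOOP_HOME=C:\Spark
set SPARK_HOME=C:\Spark\bin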
Thanks for reading