Apache Spark - Install Apache Spark 3.x on Windows 10 | Spark Tutorial
- Published: 5 Nov 2023
- This video on Spark installation will show you how to install and set up Apache Spark on Windows. First, you will see how to download the latest Spark release. Then you will set up the winutils executable file that Spark requires on Windows. You will also see how to set the environment variables as part of this installation.
Spark Download: spark.apache.org/downloads.html
Winutils download: github.com/steveloughran/winu...
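The environment-variable setup described above can be sketched as follows from a Windows Command Prompt. The install paths (C:\spark\..., C:\hadoop) are assumptions; substitute wherever you actually extracted the Spark release and placed winutils.exe:

```shell
:: Point SPARK_HOME at the extracted Spark release (assumed path and version)
setx SPARK_HOME "C:\spark\spark-3.5.0-bin-hadoop3"

:: Point HADOOP_HOME at the folder whose bin\ contains winutils.exe (assumed path)
setx HADOOP_HOME "C:\hadoop"

:: Add both bin directories to PATH so spark-shell and pyspark resolve
setx PATH "%PATH%;%SPARK_HOME%\bin;%HADOOP_HOME%\bin"
```

Note that setx only affects new console windows, so open a fresh Command Prompt before testing spark-shell.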
🙏🏻SUPPORT THE CHANNEL🙏🏻
Buy me a coffee: ko-fi.com/bigtechtalk
Subscribe: bit.ly/2A2h6sJ
Facebook: / bigtechtalk
Telegram: t.me/bigtechtalk
#bigtechtalk
#Spark
#Install
The most useful video I have seen. Thanks, sir!
You are welcome
Thanks! Every step worked perfectly!
You're welcome!
Amazing, the steps work like a charm!!
thanks
Very good video, it's awesome! Thank you!!
Thanks for your comment.
Thank you so much
You're welcome!
How to fix:
Error: Could not find or load main class org.apache.spark.launcher.Main
I got this error when trying to open pyspark. I followed all the video steps; is there anything wrong with the command?
scala> SPARK_HOME/bin/pyspark
:23: error: not found: value SPARK_HOME
SPARK_HOME/bin/pyspark
^
:23: error: not found: value pyspark
SPARK_HOME/bin/pyspark
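Judging from the scala> prompt in the transcript, a likely cause is that the command was typed inside the Scala shell rather than in Windows Command Prompt. pyspark is launched from cmd, and on Windows an environment variable is expanded with %...% syntax, roughly like this (assuming SPARK_HOME was set as in the video):

```shell
:: Run these in a Windows Command Prompt, not at the scala> prompt
:: (type :quit first to leave the Scala shell)
%SPARK_HOME%\bin\pyspark

:: or, if SPARK_HOME\bin is already on PATH, simply:
pyspark
```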
Hello, I get this error: The filename, directory name, or volume label syntax is incorrect.
How to fix:
The system cannot find the path specified.
Hi!
I have followed each and every step.
While running spark-shell in Command Prompt I am getting the error: 'spark-shell' is not recognized as an internal or external command, operable program or batch file.
Please help me out
Hi @pratikmuradi491
This error only occurs if your environment variables are not set properly. Please check your environment variables.
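A quick way to check the environment variables is from a fresh Command Prompt; the expected output paths depend on where Spark was extracted on your machine:

```shell
:: Should print the Spark install directory, not a blank line or the literal %SPARK_HOME%
echo %SPARK_HOME%

:: Should list the full path to spark-shell.cmd if SPARK_HOME\bin is on PATH
where spark-shell
```

If either command comes back empty, redo the environment-variable step and open a new Command Prompt before retrying.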
Help me please with this line:
val df = spark.read.fomat("csv").option("header",true).load("D:\\SampleData\\input\\country.csv")
:22: error: value fomat is not a member of org.apache.spark.sql.DataFrameReader
Hi @eyes9716
It should be format, not fomat:
val df = spark.read.format("csv").option("header",true).load("D:\\SampleData\\input\\country.csv")
I have done this many times but still get an error each time I try to install PySpark.
Hi @louisizuchi1626
What is the error?
C:\Users\srinivas reddy>spark-shell
'spark-shell' is not recognized as an internal or external command,
operable program or batch file.
Hi @srinivassri7902
This kind of issue comes up when your environment variables are not set properly. Can you have a look at your environment variables?
Why is it unsafe to visit?
Hi Priyanshu,
Which site or link are you talking about?
The GitHub link in the description was showing as unsafe to visit, but now it's all good. I have installed it from GitHub and it's working fine. Thank you for the tutorial, it really helped me.