I installed Spark using the method above and it works, but I am not able to use PySpark; only Spark through Scala is working. I installed Spark 2.4.0 because my Hadoop version was 2.9.1. Is my problem caused by the older version, or is something else at play? (I set up the environment variables as well, and Python 3 is also installed.)
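For anyone hitting the same symptom (Scala shell works, `pyspark` does not), a common cause is that the Python-related variables are missing from the environment. A minimal sketch of the variables typically involved — the install path below is an assumption, so adjust it to wherever your Spark was extracted:

```shell
# Assumed extraction directory; replace with your actual Spark location
export SPARK_HOME=/opt/spark
export PATH="$SPARK_HOME/bin:$PATH"

# Point PySpark explicitly at the Python 3 interpreter; if these are unset,
# pyspark may try to launch a plain "python" binary that does not exist
export PYSPARK_PYTHON=python3
export PYSPARK_DRIVER_PYTHON=python3
```

After adding these to your shell profile and reopening the terminal, running `pyspark` should pick up Python 3.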
Thank you so much.. You just saved my day..
Glad to hear that!
you saved me for my class
Thank you so much for your help
Happy to help
By any chance, can I run Java code in this environment?
Tks 👌
Welcome 👍
How do I remove that GC warning?
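It depends on which message you mean — this is a sketch under assumptions, not a definitive fix. If it is a warning printed through Spark's logger, you can raise the console log level in `log4j.properties` (on Spark 2.x, copy `conf/log4j.properties.template` to `conf/log4j.properties` first):

```properties
# conf/log4j.properties — only show errors on the console
log4j.rootCategory=ERROR, console
# Quiet the usual chatty Spark/Hadoop loggers as well
log4j.logger.org.apache.spark=ERROR
log4j.logger.org.apache.hadoop=ERROR
```

If instead it is the JVM's own "GC overhead limit exceeded" message, that is a memory problem rather than a logging one, and the usual remedy is giving the driver more memory (e.g. `spark-shell --driver-memory 4g`) rather than hiding the message.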