Spark 3.0.0 has been out for a while now. In the TPC-DS 30 TB benchmark, Spark 3.0 is roughly two times faster than Spark 2.4, and it adds Java 11 support. With the help of Anaconda, setting up a PySpark environment becomes very easy: a single command is enough.
conda install -c conda-forge openjdk pyspark -y
It will install Zulu OpenJDK 11.0.8 and PySpark 3.0.1, which of course includes Apache Spark 3.0.1.
It works on 64-bit Windows, macOS, and Linux. Of course, you need to install Anaconda first. This installation method is very newbie-friendly.