Spark and Hadoop HDFS

1. Install Spark 3.2.x on Windows 10
2. PySpark: Create a DataFrame
3. Read and Query a Parquet File in a Spark Shell
4. Recursive HDFS Directory Sizes Script
5. Spark: Write a DataFrame with Different Compression Codecs