I installed PySpark 3.2.0 via `pip install pyspark` into a conda environment named `pyspark`. I cannot find `spark-defaults.conf` anywhere. I have been looking for it under `~/miniconda3/envs/pyspark/lib/python3.9/site-packages/pyspark`, since my understanding is that this directory is what `SPARK_HOME` should point to.
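For reference, this is how I am checking the install location from inside the activated environment (a minimal sketch; the second line just shows that `SPARK_HOME` is currently unset in my shell):

```python
import os

import pyspark

# Directory that pip installed the package into; this is the path I am
# assuming SPARK_HOME should point to.
print(os.path.dirname(pyspark.__file__))

# Currently prints None for me, which is part of why I am unsure.
print(os.environ.get("SPARK_HOME"))
```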
- Where can I find `spark-defaults.conf`? I want to modify it.
- Am I right to set `SPARK_HOME` to the installation location of pyspark, i.e. `~/miniconda3/envs/pyspark/lib/python3.9/site-packages/pyspark`? (See the sketch after this list for what I plan to try.)
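Here is what I plan to try, assuming that the pip-installed package directory is a valid `SPARK_HOME` and that Spark will read `spark-defaults.conf` from its `conf/` subdirectory (or from `SPARK_CONF_DIR` if that is set). This is only a sketch of my intended approach, not something I have confirmed works with a pip install:

```python
import os

import pyspark

# Assumption: SPARK_HOME is the pip-installed package directory.
spark_home = os.path.dirname(pyspark.__file__)
os.environ["SPARK_HOME"] = spark_home

# This is where I would expect to find (or create) spark-defaults.conf.
defaults_path = os.path.join(spark_home, "conf", "spark-defaults.conf")
print(defaults_path, "exists" if os.path.exists(defaults_path) else "missing")
```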