from pyspark import SparkConf

conf = SparkConf().setAppName("my_app").setMaster("local[*]")
conf.set("spark.executor.instances", "2")
conf.set("spark.executor.memory", "2g")

This configures two executor instances, each with a memory limit of 2 GB. Note that executor settings like these only take effect when Spark runs against a cluster manager (e.g. YARN); in `local[*]` mode everything runs in a single JVM, so they are effectively ignored. Library: pyspark - PySpark is the Python API for Apache Spark, the open-source big data processing framework. It provides a simple interface for distributed computing and data processing, making it easier to work with large datasets.