[SPARK-56158][CORE] Support limitActiveProcessorCount in local mode#55132

Open
jzhuge wants to merge 1 commit into apache:master from jzhuge:SPARK-56158

Conversation


@jzhuge jzhuge commented Apr 1, 2026

What changes were proposed in this pull request?

Extend spark.driver.limitActiveProcessorCount.enabled to local mode. Previously the flag only worked in YARN cluster mode (yarn/Client.scala).

In local mode the driver runs in the same JVM as spark-submit, so the JVM flag must be injected before the JVM starts. The injection is added in SparkSubmitCommandBuilder, which builds the JVM command line prior to exec:

  • Inject -XX:ActiveProcessorCount=<spark.driver.cores> when the master is local or local[...] (local-cluster is excluded — it spawns separate worker processes)
  • Skip if the user already set -XX:ActiveProcessorCount in driver Java options
  • Handle --driver-cores in OptionParser (consistent with --driver-memory) so it is available in the effective config
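The master check and the user-override skip described above can be sketched as follows. This is a minimal illustration in plain Java; the class and method names are hypothetical and do not reproduce the actual `SparkSubmitCommandBuilder` code:

```java
// Hypothetical sketch: decide whether spark-submit should inject
// -XX:ActiveProcessorCount for a given master URL and driver Java options.
public class ActiveProcessorCountSketch {

    // Local masters are "local" or "local[...]"; "local-cluster[...]" is
    // excluded because it spawns separate worker processes.
    static boolean isLocalMaster(String master) {
        return master.equals("local") || master.matches("local\\[.*\\]");
    }

    // Skip injection if the user already set the flag in driver Java options.
    static boolean userSetFlag(String driverJavaOptions) {
        return driverJavaOptions != null
            && driverJavaOptions.contains("-XX:ActiveProcessorCount");
    }

    // Returns the JVM flag to inject, or null when no injection applies.
    static String jvmFlag(boolean enabled, String master,
                          String driverJavaOptions, int driverCores) {
        if (!enabled || !isLocalMaster(master) || userSetFlag(driverJavaOptions)) {
            return null;
        }
        return "-XX:ActiveProcessorCount=" + driverCores;
    }
}
```

For example, `jvmFlag(true, "local[4]", null, 2)` would yield `-XX:ActiveProcessorCount=2`, while a `local-cluster[...]` master or a user-supplied flag yields no injection.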

Why are the changes needed?

spark.driver.limitActiveProcessorCount.enabled had no effect in local mode. Users on multi-core machines couldn't limit driver CPU with the same mechanism available in YARN.

Does this PR introduce any user-facing change?

Yes. spark.driver.limitActiveProcessorCount.enabled=true now takes effect in local mode: -XX:ActiveProcessorCount=<spark.driver.cores> (default 1) is injected into the JVM command by spark-submit.

How was this patch tested?

Added testLimitActiveProcessorCountLocalMode in SparkSubmitCommandBuilderSuite, covering:

  • flag disabled (no injection)
  • flag enabled with default and custom cores (via --conf and --driver-cores)
  • YARN client mode (no injection)
  • local-cluster master (no injection)
  • user-supplied -XX:ActiveProcessorCount (no duplication)
  • invalid spark.driver.cores (throws)

Was this patch authored or co-authored using generative AI tooling?

Yes

@jzhuge jzhuge marked this pull request as ready for review April 2, 2026 06:12
@jzhuge jzhuge marked this pull request as draft April 3, 2026 05:15
In local mode, the driver runs in the same JVM as spark-submit.
When spark.driver.limitActiveProcessorCount.enabled=true,
inject -XX:ActiveProcessorCount=<spark.driver.cores> in SparkSubmitCommandBuilder,
unless the user already set it in driver Java options.
@jzhuge jzhuge marked this pull request as ready for review April 3, 2026 21:24

jzhuge commented Apr 4, 2026

Good to go
