Change getSparkOrYarnConfig value order
wangyum committed Mar 17, 2018
commit 06bb6f8a7df62bbf9fc579c2130b6a8a32b326e9
10 changes: 5 additions & 5 deletions core/src/main/scala/org/apache/spark/util/Utils.scala
@@ -2433,12 +2433,12 @@ private[spark] object Utils extends Logging {
    * if the key is not set in the Hadoop configuration.
    */
   def getSparkOrYarnConfig(conf: SparkConf, key: String, default: String): String = {
-    val sparkValue = conf.get(key, default)
-    if (conf.get(SparkLauncher.SPARK_MASTER, null) == "yarn"
-      && (key.startsWith("spark.hadoop.") || !key.startsWith("spark."))) {
-      new YarnConfiguration(SparkHadoopUtil.get.newConfiguration(conf)).get(key, sparkValue)
+    if (conf.contains(key)) {

    [Contributor review comment on the line above:]
    The scaladoc above is now wrong, since it still refers to the old order of precedence. Otherwise looks ok.

+      conf.get(key, default)
+    } else if (conf.get(SparkLauncher.SPARK_MASTER, null) == "yarn") {
+      new YarnConfiguration(SparkHadoopUtil.get.newConfiguration(conf)).get(key, default)
     } else {
-      sparkValue
+      default
     }
   }

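For context, the order of precedence after this commit is: the SparkConf value is used when the key is explicitly set there, otherwise the Hadoop/YARN configuration is consulted when the master is "yarn", and the supplied default is returned in all other cases. Below is a minimal sketch of the new method together with a scaladoc that matches that order, as the reviewer asks for; the scaladoc wording is only a suggested phrasing, not the committed text.

import org.apache.hadoop.yarn.conf.YarnConfiguration
import org.apache.spark.SparkConf
import org.apache.spark.deploy.SparkHadoopUtil
import org.apache.spark.launcher.SparkLauncher

/**
 * Return the value of a config from either the SparkConf or the Hadoop configuration.
 * The SparkConf value is used if the key is explicitly set there; otherwise, in YARN mode,
 * the key is looked up in the Hadoop/YARN configuration; the supplied default is returned
 * if the key is not set anywhere.
 * (Suggested wording only; the method body below mirrors the diff in this commit.)
 */
def getSparkOrYarnConfig(conf: SparkConf, key: String, default: String): String = {
  if (conf.contains(key)) {
    // Key explicitly set in the SparkConf: it takes precedence.
    conf.get(key, default)
  } else if (conf.get(SparkLauncher.SPARK_MASTER, null) == "yarn") {
    // Running on YARN: fall back to the Hadoop/YARN configuration.
    new YarnConfiguration(SparkHadoopUtil.get.newConfiguration(conf)).get(key, default)
  } else {
    // Not set anywhere and not on YARN: use the caller-supplied default.
    default
  }
}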