Closed

Changes from 1 commit · 59 commits
- 9bbfe6e [SPARK-21786][SQL] When acquiring 'compressionCodecClassName' in 'Par… (fjh100456, Dec 25, 2017)
- 48cf108 [SPARK-21786][SQL] When acquiring 'compressionCodecClassName' in 'Par… (fjh100456, Dec 25, 2017)
- 5dbd3ed spark.sql.parquet.compression.codec[SPARK-21786][SQL] When acquiring … (fjh100456, Dec 25, 2017)
- 5124f1b spark.sql.parquet.compression.codec[SPARK-21786][SQL] When acquiring … (fjh100456, Dec 25, 2017)
- 6907a3e Make comression codec take effect in hive table writing. (fjh100456, Dec 25, 2017)
- 67e40d4 Modify test (fjh100456, Dec 25, 2017)
- e2526ca Separate the pr (fjh100456, Dec 26, 2017)
- 8ae86ee Add test case with the table containing mixed compression codec (fjh100456, Dec 26, 2017)
- 94ac716 Revert back (fjh100456, Dec 26, 2017)
- 43e041f Revert back (fjh100456, Dec 26, 2017)
- ee0c558 Add a new line at the of file (fjh100456, Dec 26, 2017)
- e9f705d Fix scala style (fjh100456, Jan 2, 2018)
- d3aa7a0 Fix scala style (fjh100456, Jan 2, 2018)
- 5244aaf [SPARK-22897][CORE] Expose stageAttemptId in TaskContext (advancedxy, Jan 2, 2018)
- b96a213 [SPARK-22938] Assert that SQLConf.get is accessed only on the driver. (juliuszsompolski, Jan 3, 2018)
- a05e85e [SPARK-22934][SQL] Make optional clauses order insensitive for CREATE… (gatorsmile, Jan 3, 2018)
- b962488 [SPARK-20236][SQL] dynamic partition overwrite (cloud-fan, Jan 3, 2018)
- 27c949d [SPARK-22932][SQL] Refactor AnalysisContext (gatorsmile, Jan 2, 2018)
- 79f7263 [SPARK-22896] Improvement in String interpolation (chetkhatri, Jan 3, 2018)
- a51212b [SPARK-20960][SQL] make ColumnVector public (cloud-fan, Jan 3, 2018)
- f51c8fd [SPARK-22944][SQL] improve FoldablePropagation (cloud-fan, Jan 4, 2018)
- 1860a43 [SPARK-22933][SPARKR] R Structured Streaming API for withWatermark, t… (felixcheung, Jan 4, 2018)
- a7cfd6b [SPARK-22950][SQL] Handle ChildFirstURLClassLoader's parent (yaooqinn, Jan 4, 2018)
- eb99b8a [SPARK-22945][SQL] add java UDF APIs in the functions object (cloud-fan, Jan 4, 2018)
- 1f5e354 [SPARK-22939][PYSPARK] Support Spark UDF in registerFunction (gatorsmile, Jan 4, 2018)
- bcfeef5 [SPARK-22771][SQL] Add a missing return statement in Concat.checkInpu… (maropu, Jan 4, 2018)
- cd92913 [SPARK-21475][CORE][2ND ATTEMPT] Change to use NIO's Files API for ex… (jerryshao, Jan 4, 2018)
- bc4bef4 [SPARK-22850][CORE] Ensure queued events are delivered to all event q… (Jan 4, 2018)
- 2ab4012 [SPARK-22948][K8S] Move SparkPodInitContainer to correct package. (Jan 4, 2018)
- 84707f0 [SPARK-22953][K8S] Avoids adding duplicated secret volumes when init-… (liyinan926, Jan 4, 2018)
- ea9da61 [SPARK-22960][K8S] Make build-push-docker-images.sh more dev-friendly. (Jan 5, 2018)
- 158f7e6 [SPARK-22957] ApproxQuantile breaks if the number of rows exceeds MaxInt (juliuszsompolski, Jan 5, 2018)
- 145820b [SPARK-22825][SQL] Fix incorrect results of Casting Array to String (maropu, Jan 5, 2018)
- 5b524cc [SPARK-22949][ML] Apply CrossValidator approach to Driver/Distributed… (MrBago, Jan 5, 2018)
- f9dcdbc [SPARK-22757][K8S] Enable spark.jars and spark.files in KUBERNETES mode (liyinan926, Jan 5, 2018)
- fd4e304 [SPARK-22961][REGRESSION] Constant columns should generate QueryPlanC… (adrian-ionescu, Jan 5, 2018)
- 0a30e93 [SPARK-22940][SQL] HiveExternalCatalogVersionsSuite should succeed on… (bersprockets, Jan 5, 2018)
- d1f422c [SPARK-13030][ML] Follow-up cleanups for OneHotEncoderEstimator (jkbradley, Jan 5, 2018)
- 55afac4 [SPARK-22914][DEPLOY] Register history.ui.port (gerashegalov, Jan 6, 2018)
- bf85301 [SPARK-22937][SQL] SQL elt output binary for binary inputs (maropu, Jan 6, 2018)
- 3e3e938 [SPARK-22960][K8S] Revert use of ARG base_image in images (liyinan926, Jan 6, 2018)
- 7236914 [SPARK-22930][PYTHON][SQL] Improve the description of Vectorized UDFs… (icexelloss, Jan 6, 2018)
- e6449e8 [SPARK-22793][SQL] Memory leak in Spark Thrift Server (Jan 6, 2018)
- 0377755 [SPARK-21786][SQL] When acquiring 'compressionCodecClassName' in 'Par… (fjh100456, Jan 6, 2018)
- b66700a [SPARK-22901][PYTHON][FOLLOWUP] Adds the doc for asNondeterministic f… (HyukjinKwon, Jan 6, 2018)
- f9e7b0c [HOTFIX] Fix style checking failure (gatorsmile, Jan 6, 2018)
- 285d342 [SPARK-22973][SQL] Fix incorrect results of Casting Map to String (maropu, Jan 7, 2018)
- bd1a80a Merge remote-tracking branch 'upstream/branch-2.3' (fjh100456, Jan 8, 2018)
- 584cdc2 Merge pull request #2 from apache/master (fjh100456, Jan 8, 2018)
- 5b150bc Fix test issue (fjh100456, Jan 8, 2018)
- 2337edd Merge pull request #1 from apache/master (fjh100456, Jan 8, 2018)
- 43e7eb5 Merge branch 'master' of https://github.com/fjh100456/spark (fjh100456, Jan 9, 2018)
- 4b89b44 consider the precedence of `hive.exec.compress.output` (fjh100456, Jan 11, 2018)
- 6cf32e0 Resume to private and add public function (fjh100456, Jan 19, 2018)
- 365c5bf Resume to private and add public function (fjh100456, Jan 19, 2018)
- 99271d6 Fix test issue (fjh100456, Jan 19, 2018)
- 2b9dfbe Fix test issue (fjh100456, Jan 20, 2018)
- 5b5e1df Fix style issue (fjh100456, Jan 20, 2018)
- 118f788 Fix style issue (fjh100456, Jan 20, 2018)
# [SPARK-22914][DEPLOY] Register history.ui.port
## What changes were proposed in this pull request?

Register `spark.history.ui.port` as a known Spark conf so that it can be used in substitution expressions even if it is not set explicitly.

## How was this patch tested?

Added a unit test to demonstrate the issue.

Author: Gera Shegalov <[email protected]>
Author: Gera Shegalov <[email protected]>

Closes #20098 from gerashegalov/gera/register-SHS-port-conf.
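The idea can be sketched outside Spark: a conf key that is registered with a default resolves in substitution expressions even when it was never set explicitly. Below is a minimal, language-neutral sketch in Python (the real code is Scala); `KNOWN_DEFAULTS`, `resolve`, and `expand` are illustrative names, not Spark's API.

```python
import re

# Stands in for entries registered via ConfigBuilder (illustrative only).
KNOWN_DEFAULTS = {"spark.history.ui.port": "18080"}

def resolve(settings, key):
    """Explicit setting wins; otherwise fall back to the registered default."""
    return settings.get(key, KNOWN_DEFAULTS.get(key))

def expand(template, settings):
    """Expand ${spark.*} references; unknown keys are left untouched."""
    def repl(m):
        value = resolve(settings, m.group(1))
        return value if value is not None else m.group(0)
    return re.sub(r"\$\{(spark\.[^}]+)\}", repl, template)

# Before this change, spark.history.ui.port was only resolvable when set
# explicitly; once registered with a default, the reference always resolves.
print(expand("http://rm.host.com:${spark.history.ui.port}", {}))
# prints http://rm.host.com:18080
```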

(cherry picked from commit ea95683)
Signed-off-by: Marcelo Vanzin <[email protected]>
gerashegalov authored and Marcelo Vanzin committed Jan 6, 2018
commit 55afac4e7b4f655aa05c5bcaf7851bb1e7699dba
```diff
@@ -28,6 +28,7 @@ import org.eclipse.jetty.servlet.{ServletContextHandler, ServletHolder}

 import org.apache.spark.{SecurityManager, SparkConf}
 import org.apache.spark.deploy.SparkHadoopUtil
+import org.apache.spark.deploy.history.config.HISTORY_SERVER_UI_PORT
 import org.apache.spark.internal.Logging
 import org.apache.spark.internal.config._
 import org.apache.spark.status.api.v1.{ApiRootResource, ApplicationInfo, UIRoot}
@@ -276,7 +277,7 @@ object HistoryServer extends Logging {
       .newInstance(conf)
       .asInstanceOf[ApplicationHistoryProvider]

-    val port = conf.getInt("spark.history.ui.port", 18080)
+    val port = conf.get(HISTORY_SERVER_UI_PORT)

     val server = new HistoryServer(conf, provider, securityManager, port)
     server.bind()
```
```diff
@@ -44,4 +44,9 @@ private[spark] object config {
     .bytesConf(ByteUnit.BYTE)
     .createWithDefaultString("10g")

+  val HISTORY_SERVER_UI_PORT = ConfigBuilder("spark.history.ui.port")
+    .doc("Web UI port to bind Spark History Server")
+    .intConf
+    .createWithDefault(18080)
+
 }
```
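The `ConfigBuilder` entry pairs a key with a typed default, and `conf.get(entry)` always resolves to either the explicit setting or that default. A hedged toy analogue (Python for brevity; `ConfigEntry` and `MiniConf` are invented names, not Spark's classes):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)
class ConfigEntry:
    key: str
    default: object
    parse: Callable[[str], object]  # converts the raw string setting

# Toy analogue of SparkConf: get() always resolves, via setting or default.
class MiniConf:
    def __init__(self, settings=None):
        self.settings = settings or {}

    def get(self, entry):
        raw = self.settings.get(entry.key)
        return entry.parse(raw) if raw is not None else entry.default

# Mirrors the registration above: key "spark.history.ui.port", int, 18080.
HISTORY_SERVER_UI_PORT = ConfigEntry("spark.history.ui.port", 18080, int)

print(MiniConf().get(HISTORY_SERVER_UI_PORT))
# prints 18080
print(MiniConf({"spark.history.ui.port": "8088"}).get(HISTORY_SERVER_UI_PORT))
# prints 8088
```

This is why the `HistoryServer` change from `conf.getInt("spark.history.ui.port", 18080)` to `conf.get(HISTORY_SERVER_UI_PORT)` keeps the same behavior while centralizing the default in one registered entry.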
```diff
@@ -427,11 +427,8 @@ private[spark] class ApplicationMaster(args: ApplicationMasterArguments) extends
       uiAddress: Option[String]) = {
     val appId = client.getAttemptId().getApplicationId().toString()
     val attemptId = client.getAttemptId().getAttemptId().toString()
-    val historyAddress =
-      _sparkConf.get(HISTORY_SERVER_ADDRESS)
-        .map { text => SparkHadoopUtil.get.substituteHadoopVariables(text, yarnConf) }
-        .map { address => s"${address}${HistoryServer.UI_PATH_PREFIX}/${appId}/${attemptId}" }
-        .getOrElse("")
+    val historyAddress = ApplicationMaster
+      .getHistoryServerAddress(_sparkConf, yarnConf, appId, attemptId)

     val driverUrl = RpcEndpointAddress(
       _sparkConf.get("spark.driver.host"),
@@ -834,6 +831,16 @@ object ApplicationMaster extends Logging {
     master.getAttemptId
   }

+  private[spark] def getHistoryServerAddress(
+      sparkConf: SparkConf,
+      yarnConf: YarnConfiguration,
+      appId: String,
+      attemptId: String): String = {
+    sparkConf.get(HISTORY_SERVER_ADDRESS)
+      .map { text => SparkHadoopUtil.get.substituteHadoopVariables(text, yarnConf) }
+      .map { address => s"${address}${HistoryServer.UI_PATH_PREFIX}/${appId}/${attemptId}" }
+      .getOrElse("")
+  }
 }

 /**
```
A new test file is added (`@@ -0,0 +1,43 @@`):

```scala
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.spark.deploy.yarn

import org.apache.hadoop.yarn.conf.YarnConfiguration

import org.apache.spark.{SparkConf, SparkFunSuite}

class ApplicationMasterSuite extends SparkFunSuite {

  test("history url with hadoop and spark substitutions") {
    val host = "rm.host.com"
    val port = 18080
    val sparkConf = new SparkConf()

    sparkConf.set("spark.yarn.historyServer.address",
      "http://${hadoopconf-yarn.resourcemanager.hostname}:${spark.history.ui.port}")
    val yarnConf = new YarnConfiguration()
    yarnConf.set("yarn.resourcemanager.hostname", host)
    val appId = "application_123_1"
    val attemptId = appId + "_1"

    val shsAddr = ApplicationMaster
      .getHistoryServerAddress(sparkConf, yarnConf, appId, attemptId)

    assert(shsAddr === s"http://${host}:${port}/history/${appId}/${attemptId}")
  }
}
```
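The suite exercises two substitution passes: `${hadoopconf-*}` references resolve against the Hadoop configuration, and `${spark.*}` references against the Spark conf, which now includes the registered `spark.history.ui.port` default. A toy end-to-end resolver in Python (illustrative only; the real logic lives in `SparkHadoopUtil.substituteHadoopVariables` and `SparkConf`):

```python
import re

def substitute(template, hadoop_conf, spark_conf):
    """Resolve ${hadoopconf-*} from hadoop_conf and other ${*} keys from spark_conf."""
    def repl(m):
        key = m.group(1)
        if key.startswith("hadoopconf-"):
            return hadoop_conf.get(key[len("hadoopconf-"):], m.group(0))
        return spark_conf.get(key, m.group(0))
    return re.sub(r"\$\{([^}]+)\}", repl, template)

app_id = "application_123_1"
attempt_id = app_id + "_1"
base = substitute(
    "http://${hadoopconf-yarn.resourcemanager.hostname}:${spark.history.ui.port}",
    hadoop_conf={"yarn.resourcemanager.hostname": "rm.host.com"},
    spark_conf={"spark.history.ui.port": "18080"},  # the registered default
)
# Mirrors the suite's expected address, assuming UI_PATH_PREFIX is "/history".
print(f"{base}/history/{app_id}/{attempt_id}")
# prints http://rm.host.com:18080/history/application_123_1/application_123_1_1
```

Before this change, the second pass had nothing to substitute for `${spark.history.ui.port}` unless the user set it explicitly, which is exactly the gap the test demonstrates.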