@@ -19,14 +19,21 @@ package org.apache.spark.repl

import scala.tools.nsc.{Settings, CompilerCommand}
import scala.Predef._
import org.apache.spark.annotation.DeveloperApi

/**
* Command class enabling Spark-specific command line options (provided by
* <i>org.apache.spark.repl.SparkRunnerSettings</i>).
*
* @example new SparkCommandLine(Nil).settings
*
* @param args The list of command line arguments
* @param settings The underlying settings to associate with this set of
* command-line options
*/
@DeveloperApi
class SparkCommandLine(args: List[String], override val settings: Settings)
Contributor:

Should this be marked as a developer API also, if it's exposed?

Author:

Oh, do you use DeveloperApi for entire classes? Or are you saying so because the purpose I described for SparkCommandLine was to retrieve settings?

I didn't mark SparkILoop, SparkIMain, SparkHelper (forced to be public due to packaging), SparkJLineCompletion, or SparkCommandLine as DeveloperApi at the class level. I was assuming that the internal DeveloperApi markings conveyed that. I can go back and do that if that's how things are normally done.

Or, I can just add it to SparkCommandLine, since it is the only one without any internal DeveloperApi marks.

Contributor:

Usually, out of an abundance of caution, we mark it at the class level as well (as sketched below), even if everything bytecode-exposed inside the class is also marked. We tend to err on the side of over-communication with this.

extends CompilerCommand(args, settings) {

def this(args: List[String], error: String => Unit) {
this(args, new SparkRunnerSettings(error))
}
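A minimal sketch of the class-level annotation convention described in the review thread above; the class and method names here are hypothetical, not part of this patch:

import org.apache.spark.annotation.DeveloperApi

// Hypothetical class illustrating the convention from the thread above:
// the annotation is applied at the class level in addition to every
// bytecode-exposed member.
@DeveloperApi
class ExposedReplComponent {
  @DeveloperApi
  def exposedOperation(): Unit = ()
}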
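And a minimal usage sketch in the spirit of the scaladoc @example above (new SparkCommandLine(Nil).settings); the object name and the stderr error handler are illustrative assumptions, not part of this patch:

import org.apache.spark.repl.SparkCommandLine

// Illustrative only: parse REPL arguments into compiler settings using the
// auxiliary constructor shown above, routing parse errors to stderr.
object SparkCommandLineUsage {
  def main(args: Array[String]): Unit = {
    val cmd = new SparkCommandLine(args.toList, (msg: String) => Console.err.println(msg))
    val settings = cmd.settings // scala.tools.nsc.Settings built from the arguments
    println(settings)
  }
}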
@@ -15,7 +15,7 @@ import scala.tools.nsc.ast.parser.Tokens.EOF

import org.apache.spark.Logging

-trait SparkExprTyper extends Logging {
+private[repl] trait SparkExprTyper extends Logging {
val repl: SparkIMain

import repl._
@@ -17,6 +17,23 @@

package scala.tools.nsc

import org.apache.spark.annotation.DeveloperApi

// NOTE: Forced to be public (and in scala.tools.nsc package) to access the
// settings "explicitParentLoader" method

/**
* Provides exposure for the explicitParentLoader method on settings instances.
*/
@DeveloperApi
object SparkHelper {
/**
* Retrieves the explicit parent loader for the provided settings.
*
* @param settings The settings whose explicit parent loader to retrieve
*
* @return The Optional classloader representing the explicit parent loader
*/
@DeveloperApi
def explicitParentLoader(settings: Settings) = settings.explicitParentLoader
}
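A short usage sketch of the helper above; the surrounding object and the fallback to the context class loader are illustrative assumptions, not part of this patch:

import scala.tools.nsc.{Settings, SparkHelper}

// Illustrative only: explicitParentLoader returns an Option[ClassLoader], so a
// caller typically falls back to another loader when none is configured.
object ExplicitParentLoaderUsage {
  def main(args: Array[String]): Unit = {
    val settings = new Settings()
    val parent: ClassLoader = SparkHelper
      .explicitParentLoader(settings)
      .getOrElse(Thread.currentThread().getContextClassLoader)
    println(parent)
  }
}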