[SPARK-4923][REPL] Add Developer API to REPL to allow re-publishing the REPL jar #4034
Conversation
Can one of the admins verify this patch?
Should this be marked as a Developer API as well, if it's exposed?
Oh, do you apply DeveloperApi to entire classes? Or are you saying so because the purpose I described for SparkCommandLine was to retrieve settings?
I didn't mark SparkILoop, SparkIMain, SparkHelper (forced to be public due to packaging), SparkJLineCompletion, or SparkCommandLine as DeveloperApi on the class level. I was assuming that internal markings of DeveloperApi conveyed that. I can go back and do that if that's the way things are normally done.
Or, I can just add it to SparkCommandLine since it is the only one without any internal DeveloperApi marks.
Usually, out of an abundance of caution, we mark it at the class level as well, even if everything bytecode-exposed inside the class is also marked. We tend to err on the side of over-communication with this.
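For illustration, a minimal sketch of what "marking at the class level as well" looks like. The `DeveloperApi` annotation defined here is a local stand-in for `org.apache.spark.annotation.DeveloperApi`, and `CommandLineExample` is a hypothetical class, so the snippet compiles without Spark on the classpath:

```scala
import scala.annotation.StaticAnnotation

// Local stand-in for org.apache.spark.annotation.DeveloperApi.
class DeveloperApi extends StaticAnnotation

// The class itself carries the annotation, in addition to its
// bytecode-exposed members, per the convention described above.
@DeveloperApi
class CommandLineExample(val args: List[String]) {
  // Exposed member, annotated as well.
  @DeveloperApi
  def settings: List[String] = args
}
```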
Jenkins, test this please. Thanks a lot @rcsenkbeil for such a thorough treatment of this! LGTM in general; I made a minor comment.
Test build #25617 has started for PR 4034 at commit
Test build #25617 has finished for PR 4034 at commit
Test PASSed.
Jenkins, test this please.
Test build #25637 has started for PR 4034 at commit
Test build #25637 has finished for PR 4034 at commit
Test FAILed.
Whoops, missed adding an import for the DeveloperApi on SparkCommandLine. Give me a minute to add it.
Took a little longer on an old computer, but I made sure it built successfully on my end this time. Should hopefully be good to go now.
Jenkins, ok to test. Jenkins, test this please.
Test build #25652 has started for PR 4034 at commit
Test build #25652 has finished for PR 4034 at commit
Test PASSed.
Thanks Chip - I will pull this in. After some more thought, I'm just going to pull this into master for 1.3+, and for 1.2 we'll publish the original REPL with all the open permissions. My concern is giving people some time to adjust to the newly locked-down permissions before we ship them in a release.
@pwendell, that sounds like a good decision. Thanks for letting me know!
As requested in SPARK-4923, I've provided a rough Developer API for the REPL. I've only done this for Scala 2.10, because the Scala 2.11 side does not appear to be fully implemented: the Scala 2.11 REPL still has the old `scala.tools.nsc` package, and its `SparkIMain` does not appear to have the class server needed for shipping code over (unless this functionality has been moved elsewhere?). I also left `ExecutorClassLoader` and `ConstructorCleaner` alone, as I have no experience working with those classes.

This marks the majority of methods in `SparkIMain` as private, with a few special cases being `private[repl]` because other classes within the same package access them. Every public method has been marked with `@DeveloperApi`, as suggested by @pwendell, and I took the liberty of writing a Scaladoc for each one to further explain its usage.

As the Scala 2.11 REPL conforms to JSR-223, the Spark Kernel uses the `SparkIMain` of Scala 2.10 in the same manner. So, I've taken care to expose predominantly the methods needed for a JSR-223-style scripting engine implementation.
Additional functionality that I marked as exposed included the following:
Aside from `SparkIMain`, I updated other classes/traits and their methods in the repl package to be private/package-protected where possible. A few odd cases (like `SparkHelper` sitting in the `scala.tools.nsc` package to expose a private variable) still exist, but I did my best at labelling them. `SparkCommandLine` has proven useful for extracting settings, and `SparkJLineCompletion` has proven useful for implementing auto-completion in the Spark Kernel project. Other than those, and `SparkIMain`, my experience has been that the remaining classes/methods are not necessary for interactive applications taking advantage of the REPL API.

Tested via the following:
Also did a quick verification that I could start the shell and execute some code:
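The JSR-223-style usage described in this PR can be sketched roughly as follows. This is an illustrative sketch only, not code from the patch: the method names (`initializeSynchronous`, `interpret`, `close`) follow `scala.tools.nsc.interpreter.IMain`, which the 2.10 `SparkIMain` extends, so verify them against the annotated `@DeveloperApi` surface before relying on them:

```scala
import org.apache.spark.repl.SparkIMain
import scala.tools.nsc.Settings

// Hedged sketch: drive the Scala 2.10 SparkIMain the way a
// JSR-223-style scripting engine would.
object InterpreterSketch extends App {
  val settings = new Settings
  settings.usejavacp.value = true   // compile against the JVM classpath

  val intp = new SparkIMain(settings)
  intp.initializeSynchronous()      // force compiler initialization up front
  intp.interpret("val x = 21 * 2")  // evaluate a line of Scala source
  intp.close()                      // release compiler resources
}
```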