SPARK-1565 (Addendum): Replace run-example with spark-submit.
#704
Closed
Changes from 1 commit
@@ -17,28 +17,10 @@
 # limitations under the License.
 #
 
-cygwin=false
-case "`uname`" in
-    CYGWIN*) cygwin=true;;
-esac
-
 SCALA_VERSION=2.10
 
-# Figure out where the Scala framework is installed
 FWDIR="$(cd `dirname $0`/..; pwd)"
-
-# Export this as SPARK_HOME
 export SPARK_HOME="$FWDIR"
-
-. $FWDIR/bin/load-spark-env.sh
-
-if [ -z "$1" ]; then
-  echo "Usage: run-example <example-class> [<args>]" >&2
-  exit 1
-fi
-
-# Figure out the JAR file that our examples were packaged into. This includes a bit of a hack
-# to avoid the -sources and -doc packages that are built by publish-local.
 EXAMPLES_DIR="$FWDIR"/examples
 
 if [ -f "$FWDIR/RELEASE" ]; then
@@ -49,46 +31,31 @@ fi
 if [[ -z $SPARK_EXAMPLES_JAR ]]; then
   echo "Failed to find Spark examples assembly in $FWDIR/lib or $FWDIR/examples/target" >&2
-  echo "You need to build Spark with sbt/sbt assembly before running this program" >&2
+  echo "You need to build Spark before running this program" >&2
   exit 1
 fi
 
+SPARK_EXAMPLES_JAR_REL=${SPARK_EXAMPLES_JAR#$FWDIR/}
+
-# Since the examples JAR ideally shouldn't include spark-core (that dependency should be
-# "provided"), also add our standard Spark classpath, built using compute-classpath.sh.
-CLASSPATH=`$FWDIR/bin/compute-classpath.sh`
-CLASSPATH="$SPARK_EXAMPLES_JAR:$CLASSPATH"
-
-if $cygwin; then
-  CLASSPATH=`cygpath -wp $CLASSPATH`
-  export SPARK_EXAMPLES_JAR=`cygpath -w $SPARK_EXAMPLES_JAR`
-fi
-
-# Find java binary
-if [ -n "${JAVA_HOME}" ]; then
-  RUNNER="${JAVA_HOME}/bin/java"
-else
-  if [ `command -v java` ]; then
-    RUNNER="java"
-  else
-    echo "JAVA_HOME is not set" >&2
-    exit 1
-  fi
-fi
+EXAMPLE_CLASS="<example-class>"
+EXAMPLE_ARGS="[<example args>]"
+EXAMPLE_MASTER=${MASTER:-"<master>"}
 
-# Set JAVA_OPTS to be able to load native libraries and to set heap size
-JAVA_OPTS="$SPARK_JAVA_OPTS"
-# Load extra JAVA_OPTS from conf/java-opts, if it exists
-if [ -e "$FWDIR/conf/java-opts" ] ; then
-  JAVA_OPTS="$JAVA_OPTS `cat $FWDIR/conf/java-opts`"
+if [ -n "$1" ]; then
+  EXAMPLE_CLASS="$1"
+  shift
 fi
-export JAVA_OPTS
 
-if [ "$SPARK_PRINT_LAUNCH_COMMAND" == "1" ]; then
-  echo -n "Spark Command: "
-  echo "$RUNNER" -cp "$CLASSPATH" $JAVA_OPTS "$@"
-  echo "========================================"
-  echo
+if [ -n "$1" ]; then
+  EXAMPLE_ARGS="$@"
 fi
 
-exec "$RUNNER" -cp "$CLASSPATH" $JAVA_OPTS "$@"
+echo "NOTE: This script has been replaced with ./bin/spark-submit. Please run:" >&2
+echo
+echo "./bin/spark-submit \\" >&2
+echo "  --master $EXAMPLE_MASTER \\" >&2
+echo "  --class $EXAMPLE_CLASS \\" >&2
+echo "  $SPARK_EXAMPLES_JAR_REL \\" >&2
+echo "  $EXAMPLE_ARGS" >&2
Contributor (author) commented on the line above:
Note to self: if we call this directly we'll need to pass
+echo
+exit 1
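For reference, the command printed by the new stub would be copied and run roughly like this; the master, class name, jar path, and final argument below are illustrative placeholders, not values taken from this diff:

./bin/spark-submit \
  --master local[*] \
  --class org.apache.spark.examples.SparkPi \
  lib/spark-examples-*.jar \
  10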
I thought about this some more; I think maybe we should just call spark-submit with the supplied master instead of telling the user this stuff. Or we could call spark-submit and then print out to the user how to run it.
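A minimal bash sketch of that first alternative (illustrative only, not part of this diff), reusing the FWDIR, EXAMPLE_MASTER, EXAMPLE_CLASS, and SPARK_EXAMPLES_JAR variables the script already sets, and assuming EXAMPLE_MASTER resolves to a real master URL such as local[*]:

# Hand off to spark-submit directly instead of only printing instructions (sketch)
exec "$FWDIR"/bin/spark-submit \
  --master "$EXAMPLE_MASTER" \
  --class "$EXAMPLE_CLASS" \
  "$SPARK_EXAMPLES_JAR" \
  "$@"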
Yes, I completely agree. We don't want the user to have to type out this more complicated stuff with the library path and all. Just:

bin/run-example org.apache.examples.spark.SparkPi <example params>

In fact, now that all the examples are inside the spark.examples package, we can try to make it even simpler. To run SparkPi, one should be able to just say:

./bin/run-example SparkPi

That would be very simple!
Great idea!
Well, but then you have streaming examples and mllib examples. Do we expect the user to type in mllib.MovieLensALS then? I actually think the org.apache.examples.spark.SparkPi form is more consistent with the rest (i.e. SparkSubmit). Maybe we should accept both.
Yeah, so I think if it starts with org.apache.spark.examples we would pass it through. If not, we'll prepend "org.apache.spark.examples.".
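A bash sketch of that rule (hypothetical, not part of this PR): pass fully qualified names through unchanged and prepend the examples package otherwise:

# Accept both short names like "SparkPi" or "mllib.MovieLensALS" and fully
# qualified class names; only prepend the package when it is not already there.
if [[ ! "$EXAMPLE_CLASS" == org.apache.spark.examples* ]]; then
  EXAMPLE_CLASS="org.apache.spark.examples.$EXAMPLE_CLASS"
fi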