Update the image building script
foxish committed Dec 20, 2017
commit a7e0c4c31d2ae4310b26bb29e58921a0a5246c8b
7 changes: 4 additions & 3 deletions sbin/build-push-docker-images.sh
@@ -19,11 +19,11 @@
# This script builds and pushes docker images when run from a release of Spark
# with Kubernetes support.

-declare -A path=( [spark-driver]=dockerfiles/driver/Dockerfile \
-                  [spark-executor]=dockerfiles/executor/Dockerfile )
+declare -A path=( [spark-driver]=kubernetes/dockerfiles/driver/Dockerfile \
+                  [spark-executor]=kubernetes/dockerfiles/executor/Dockerfile )

function build {
-  docker build -t spark-base -f dockerfiles/spark-base/Dockerfile .
+  docker build -t spark-base -f kubernetes/dockerfiles/spark-base/Dockerfile .
  for image in "${!path[@]}"; do
    docker build -t ${REPO}/$image:${TAG} -f ${path[$image]} .
  done
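
This hunk prefixes every Dockerfile path with kubernetes/, presumably matching where a Kubernetes-enabled Spark distribution now places them. As a quick, illustrative sanity check (not part of the script), the new locations can be verified from the distribution root before building:

# Illustrative check: confirm the Dockerfiles exist at their
# new kubernetes/-prefixed paths before running the script.
for f in kubernetes/dockerfiles/spark-base/Dockerfile \
         kubernetes/dockerfiles/driver/Dockerfile \
         kubernetes/dockerfiles/executor/Dockerfile; do
  [ -f "$f" ] || echo "missing: $f"
done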
@@ -37,6 +37,7 @@ function push {
}

function usage {
Contributor commented:

Likely not for this PR, but for the future I'm wondering what you think about extending this to package up a usable Python env, so we can solve the dependency management issue as well?

Really excited to see the progress :)

Contributor Author replied:

We already do that in our fork, and when we get PySpark and R submitted (in Spark 2.4, hopefully), we will extend this script.
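
As a speculative illustration of the extension discussed here (nothing below exists in this commit; the spark-driver-py image name and Dockerfile path are hypothetical), the associative array could simply grow a PySpark entry, and the existing build and push loops would pick it up unchanged:

# Hypothetical sketch only: the driver-py entry is an assumption,
# not part of this PR; the loops over ${!path[@]} need no changes.
declare -A path=( [spark-driver]=kubernetes/dockerfiles/driver/Dockerfile \
                  [spark-executor]=kubernetes/dockerfiles/executor/Dockerfile \
                  [spark-driver-py]=kubernetes/dockerfiles/driver-py/Dockerfile )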

echo "This script must be run from a runnable distribution of Apache Spark."
echo "Usage: ./sbin/build-push-docker-images.sh -r <repo> -t <tag> build"
echo " ./sbin/build-push-docker-images.sh -r <repo> -t <tag> push"
echo "for example: ./sbin/build-push-docker-images.sh -r docker.io/myrepo -t v2.3.0 push"
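
For reference, a full build-then-push run from the root of a Spark distribution, using the repository and tag from the usage example above, would be:

# Build the spark-base, spark-driver, and spark-executor images,
# then push them; repo and tag values mirror the usage text.
./sbin/build-push-docker-images.sh -r docker.io/myrepo -t v2.3.0 build
./sbin/build-push-docker-images.sh -r docker.io/myrepo -t v2.3.0 push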