Clarify the docs for cross build and untabify because mix of tabs/spaces is meeps
holdenk committed May 22, 2020
commit f7fdddcfeb2276491b89a64cd9b89a2692f29de3
28 changes: 13 additions & 15 deletions bin/docker-image-tool.sh
@@ -64,9 +64,6 @@ function docker_push {
   if [ $? -ne 0 ]; then
     error "Failed to push $image_name Docker image."
   fi
-  if [ "${CROSS_BUILD}" != "false" ]; then
-    docker buildx push "$(image_ref ${image_name})"
-  fi
 else
   echo "$(image_ref ${image_name}) image not found. Skipping push for this image."
 fi
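
This hunk drops the separate `docker buildx push` from the push path: with buildx, a cross build has to publish its manifest as part of the build itself, since a multi-arch image cannot be loaded into the local Docker daemon. A minimal sketch of the combined step, assuming the `--push` flag and illustrative image and platform names (none of which are quoted from this patch):

```sh
# Cross build for two architectures and push the manifest list in one step.
# Multi-arch output cannot stay in the local image store, so the push has to
# happen at build time rather than in a later standalone push.
docker buildx build \
  --platform linux/amd64,linux/arm64 \
  --push \
  -t docker.io/myrepo/spark:v3.0.0 .
```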
@@ -196,12 +193,12 @@ function build {
       -t $(image_ref spark-py) \
       -f "$PYDOCKERFILE" .)
     if [ $? -ne 0 ]; then
-      error "Failed to build PySpark Docker image, please refer to Docker build output for details."
+      error "Failed to build PySpark Docker image, please refer to Docker build output for details."
     fi
     if [ "${CROSS_BUILD}" != "false" ]; then
       (cd $(img_ctx_dir pyspark) && docker buildx build $ARCHS $NOCACHEARG "${BINDING_BUILD_ARGS[@]}" \
-        -t $(image_ref spark-py) \
-        -f "$PYDOCKERFILE" .)
+        -t $(image_ref spark-py) \
+        -f "$PYDOCKERFILE" .)
     fi
   fi

@@ -214,8 +211,8 @@ function build {
     fi
     if [ "${CROSS_BUILD}" != "false" ]; then
       (cd $(img_ctx_dir sparkr) && docker buildx build $ARCHS $NOCACHEARG "${BINDING_BUILD_ARGS[@]}" \
-        -t $(image_ref spark-r) \
-        -f "$RDOCKERFILE" .)
+        -t $(image_ref spark-r) \
+        -f "$RDOCKERFILE" .)
     fi
   fi
}
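
Both cross-build branches above hand the target platforms to `docker buildx build` through `$ARCHS`. The default value of that variable is not shown in this diff, so the following is only a sketch under assumed values:

```sh
# Assumed values, not quoted from this patch: ARCHS carries the buildx
# --platform flag, and NOCACHEARG is set when the tool is invoked with -n.
ARCHS="--platform linux/amd64,linux/arm64"
NOCACHEARG="--no-cache"
# Path as laid out in a Spark binary distribution; treat as an assumption here.
PYDOCKERFILE="kubernetes/dockerfiles/spark/bindings/python/Dockerfile"

# With those in place, the PySpark branch expands to roughly:
docker buildx build $ARCHS $NOCACHEARG \
  -t docker.io/myrepo/spark-py:v3.0.0 \
  -f "$PYDOCKERFILE" .
```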
@@ -233,24 +230,24 @@ Builds or pushes the built-in Spark Docker image.

 Commands:
   build       Build image. Requires a repository address to be provided if the image will be
-              pushed to a different registry.
+              pushed to a different registry.
   push        Push a pre-built image to a registry. Requires a repository address to be provided.
 
 Options:
   -f file     Dockerfile to build for JVM based Jobs. By default builds the Dockerfile shipped with Spark.
   -p file     (Optional) Dockerfile to build for PySpark Jobs. Builds Python dependencies and ships with Spark.
-              Skips building PySpark docker image if not specified.
+              Skips building PySpark docker image if not specified.
   -R file     (Optional) Dockerfile to build for SparkR Jobs. Builds R dependencies and ships with Spark.
-              Skips building SparkR docker image if not specified.
+              Skips building SparkR docker image if not specified.
   -r repo     Repository address.
   -t tag      Tag to apply to the built image, or to identify the image to be pushed.
   -m          Use minikube's Docker daemon.
   -n          Build docker image with --no-cache
   -u uid      UID to use in the USER directive to set the user the main Spark process runs as inside the
-              resulting container
-  -x          Use docker buildx to cross build
+              resulting container
+  -X          Use docker buildx to cross build. Automatically pushes.
   -b arg      Build arg to build or push the image. For multiple build args, this option needs to
-              be used separately for each build arg.
+              be used separately for each build arg.

Using minikube when building images will do so directly into minikube's Docker daemon.
There is no need to push the images into minikube in that case, they'll be automatically
@@ -277,7 +274,8 @@ Examples:

 - Build and push JDK11-based image for multiple archs to docker.io/myrepo
   $0 -r docker.io/myrepo -t v3.0.0 -X -b java_image_tag=11-jre-slim build
-  $0 -r docker.io/myrepo -t v3.0.0 -X push
+  # Note: buildx, which does cross building, needs to do the push during build,
+  # so there is no separate push step with -X

EOF
}
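
Since `-X` now both cross builds and pushes, the remaining prerequisite is a buildx builder that can target foreign architectures. A one-time setup sketch using stock Docker commands (not part of this patch; the builder name is illustrative):

```sh
# Create a BuildKit-backed builder, make it the default, and confirm which
# platforms it can build for.
docker buildx create --name spark-builder --use
docker buildx inspect --bootstrap
```

With the builder in place, the `-X -b java_image_tag=11-jre-slim build` example above builds and pushes all target architectures in a single invocation.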