12 changes: 5 additions & 7 deletions dev/create-release/release-build.sh
@@ -74,7 +74,7 @@ GIT_REF=${GIT_REF:-master}
# Destination directory parent on remote server
REMOTE_PARENT_DIR=${REMOTE_PARENT_DIR:-/home/$ASF_USERNAME/public_html}

-GPG="gpg --no-tty --batch"
+GPG="gpg -u $GPG_KEY --no-tty --batch"
Member:

Does the script work without GPG_KEY , or is it required to run this script?

Contributor Author:

It is required to run the script. This parameter was already required to be set (see L60) but was being ignored, so I'm assuming this was just a bug that happened to work for users whose default key is their Apache key, and that it was intended to be used this way all along.
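Since the new invocation interpolates `$GPG_KEY` directly, a guard along these lines would fail fast instead of producing a broken `gpg` command. This is a hypothetical sketch, not part of the diff; the `require_gpg_key` function name is made up:

```shell
#!/usr/bin/env bash
# Hypothetical guard (not in the PR): refuse to continue when GPG_KEY is
# unset, since "gpg -u $GPG_KEY" would otherwise expand to an invalid
# command line at signing time.
require_gpg_key() {
  if [ -z "${GPG_KEY:-}" ]; then
    echo "ERROR: GPG_KEY must be set to your Apache signing key ID" >&2
    return 1
  fi
  # Mirror the command string the script builds after this check passes.
  GPG="gpg -u $GPG_KEY --no-tty --batch"
}
```

Called near the top of the script, this turns a late, confusing signing failure into an immediate, explicit error.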

NEXUS_ROOT=https://repository.apache.org/service/local/staging
NEXUS_PROFILE=d63f592e7eac0 # Profile for Spark staging uploads
BASE_DIR=$(pwd)
@@ -125,7 +125,7 @@ else
echo "Please set JAVA_HOME correctly."
exit 1
else
-JAVA_HOME="$JAVA_7_HOME"
+export JAVA_HOME="$JAVA_7_HOME"
fi
fi
fi
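The added `export` here is the substance of the change: a bare assignment is visible only to the release script itself, not to the child processes (such as the Maven build) that consult `JAVA_HOME`. A minimal demonstration, assuming nothing beyond POSIX shell semantics (the `/opt/java7` path is made up):

```shell
#!/usr/bin/env bash
# Demonstrates why "export" matters for variables read by child processes.
unset JAVA_HOME
JAVA_HOME=/opt/java7                     # shell-local: children do not see it
bash -c 'echo "${JAVA_HOME:-unset}"'     # prints "unset"
export JAVA_HOME                         # now placed in the environment
bash -c 'echo "$JAVA_HOME"'              # prints "/opt/java7"
```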
@@ -140,7 +140,7 @@ DEST_DIR_NAME="spark-$SPARK_PACKAGE_VERSION"
function LFTP {
SSH="ssh -o ConnectTimeout=300 -o StrictHostKeyChecking=no -i $ASF_RSA_KEY"
COMMANDS=$(cat <<EOF
-set net:max-retries 1 &&
+set net:max-retries 2 &&
set sftp:connect-program $SSH &&
connect -u $ASF_USERNAME,p sftp://home.apache.org &&
$@
@@ -345,16 +345,14 @@ if [[ "$1" == "publish-snapshot" ]]; then
# -DskipTests $SCALA_2_12_PROFILES $PUBLISH_PROFILES clean deploy

# Clean-up Zinc nailgun process
-/usr/sbin/lsof -P |grep $ZINC_PORT | grep LISTEN | awk '{ print $2; }' | xargs kill
+lsof -P |grep $ZINC_PORT | grep LISTEN | awk '{ print $2; }' | xargs kill
Member:

Hm, @holdenk, I thought the full path of lsof is required for non-root users on a few OSes.

Contributor Author:

Huh, that's weird; not something I expected. I think in that case the user could just add /usr/sbin to their $PATH (whereas on some systems lsof is in /usr/bin rather than /usr/sbin, so fixing that would require root).

Contributor Author:

If this is something people are likely to run into, I can add some code to check for lsof in different places under /usr, but that seems brittle. Checking on one of the Jenkins machines, my local laptop, and a random Debian server I run, lsof works as a non-root user just fine. (Although searching on Google does turn up some folks who have had to add /usr/sbin or another directory to their path as non-root users.)
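The multi-location probe discussed here could be sketched as follows. This is a hypothetical helper written for illustration (the `find_lsof` name is made up), showing the brittleness the author mentions: the fallback list has to enumerate locations by hand.

```shell
#!/usr/bin/env bash
# Hypothetical probe: prefer whatever lsof is already on PATH, then fall
# back to the common locations where distros install it.
find_lsof() {
  if command -v lsof >/dev/null 2>&1; then
    command -v lsof
    return 0
  fi
  for candidate in /usr/sbin/lsof /usr/bin/lsof /sbin/lsof; do
    if [ -x "$candidate" ]; then
      echo "$candidate"
      return 0
    fi
  done
  echo "lsof not found; add its directory to PATH" >&2
  return 1
}
```

The Zinc clean-up line would then use `"$(find_lsof)"` in place of the bare `lsof`, at the cost of maintaining that candidate list.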


rm $tmp_settings
cd ..
exit 0
fi

if [[ "$1" == "publish-release" ]]; then
SPARK_VERSION=$SPARK_PACKAGE_VERSION

cd spark
# Publish Spark to Maven release repo
echo "Publishing Spark checkout at '$GIT_REF' ($git_hash)"
@@ -384,7 +382,7 @@ if [[ "$1" == "publish-release" ]]; then
# -DskipTests $SCALA_2_12_PROFILES $PUBLISH_PROFILES clean install

# Clean-up Zinc nailgun process
-/usr/sbin/lsof -P |grep $ZINC_PORT | grep LISTEN | awk '{ print $2; }' | xargs kill
+lsof -P |grep $ZINC_PORT | grep LISTEN | awk '{ print $2; }' | xargs kill

#./dev/change-scala-version.sh 2.11
