43 changes: 43 additions & 0 deletions .github/workflows/build_3.3.3.yaml
@@ -0,0 +1,43 @@
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#

name: "Build and Test (3.3.3)"

on:
  pull_request:
    branches:
      - 'master'
    paths:
      - '3.3.3/**'
      - '.github/workflows/build_3.3.3.yaml'
      - '.github/workflows/main.yml'

jobs:
  run-build:
    strategy:
      matrix:
        image-type: ["all", "python", "scala", "r"]
    name: Run
    secrets: inherit
    uses: ./.github/workflows/main.yml
    with:
      spark: 3.3.3
      scala: 2.12
      java: 11
      image-type: ${{ matrix.image-type }}
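The `matrix` above fans the single `run-build` job out into four calls of the reusable `main.yml` workflow, one per image type, each receiving the same `spark`/`scala`/`java` inputs. A minimal sketch of that expansion outside of Actions (the job-name format here is illustrative, not what GitHub renders):

```shell
# Sketch: expand the image-type matrix the way the workflow fans out,
# producing one reusable-workflow invocation per entry.
spark_version="3.3.3"
jobs=""
for image_type in all python scala r; do
  jobs="${jobs}run-build (${image_type}): spark=${spark_version} scala=2.12 java=11\n"
done
printf "%b" "$jobs"
```

Each expanded job inherits the caller's secrets (`secrets: inherit`), so `main.yml` needs no secret plumbing of its own.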
1 change: 1 addition & 0 deletions .github/workflows/publish.yml
@@ -30,6 +30,7 @@ on:
        options:
          - 3.4.1
          - 3.4.0
          - 3.3.3
          - 3.3.2
          - 3.3.1
          - 3.3.0
1 change: 1 addition & 0 deletions .github/workflows/test.yml
@@ -30,6 +30,7 @@ on:
        options:
          - 3.4.1
          - 3.4.0
          - 3.3.3
          - 3.3.2
          - 3.3.1
          - 3.3.0
29 changes: 29 additions & 0 deletions 3.3.3/scala2.12-java11-python3-r-ubuntu/Dockerfile
@@ -0,0 +1,29 @@
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
FROM spark:3.3.3-scala2.12-java11-ubuntu

USER root

RUN set -ex; \
    apt-get update; \
    apt-get install -y python3 python3-pip; \
    apt-get install -y r-base r-base-dev; \
    rm -rf /var/lib/apt/lists/*

ENV R_HOME /usr/lib/R

USER spark
26 changes: 26 additions & 0 deletions 3.3.3/scala2.12-java11-python3-ubuntu/Dockerfile
@@ -0,0 +1,26 @@
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
FROM spark:3.3.3-scala2.12-java11-ubuntu

USER root

RUN set -ex; \
    apt-get update; \
    apt-get install -y python3 python3-pip; \
    rm -rf /var/lib/apt/lists/*

USER spark
28 changes: 28 additions & 0 deletions 3.3.3/scala2.12-java11-r-ubuntu/Dockerfile
@@ -0,0 +1,28 @@
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
FROM spark:3.3.3-scala2.12-java11-ubuntu

USER root

RUN set -ex; \
    apt-get update; \
    apt-get install -y r-base r-base-dev; \
    rm -rf /var/lib/apt/lists/*

ENV R_HOME /usr/lib/R

USER spark
79 changes: 79 additions & 0 deletions 3.3.3/scala2.12-java11-ubuntu/Dockerfile
@@ -0,0 +1,79 @@
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
FROM eclipse-temurin:11-jre-focal

ARG spark_uid=185

RUN groupadd --system --gid=${spark_uid} spark && \
    useradd --system --uid=${spark_uid} --gid=spark spark

RUN set -ex; \
    apt-get update; \
    apt-get install -y gnupg2 wget bash tini libc6 libpam-modules krb5-user libnss3 procps net-tools gosu libnss-wrapper; \
    mkdir -p /opt/spark; \
    mkdir /opt/spark/python; \
    mkdir -p /opt/spark/examples; \
    mkdir -p /opt/spark/work-dir; \
    chmod g+w /opt/spark/work-dir; \
    touch /opt/spark/RELEASE; \
    chown -R spark:spark /opt/spark; \
    echo "auth required pam_wheel.so use_uid" >> /etc/pam.d/su; \
    rm -rf /var/lib/apt/lists/*

# Install Apache Spark
# https://downloads.apache.org/spark/KEYS
ENV SPARK_TGZ_URL=https://archive.apache.org/dist/spark/spark-3.3.3/spark-3.3.3-bin-hadoop3.tgz \
    SPARK_TGZ_ASC_URL=https://archive.apache.org/dist/spark/spark-3.3.3/spark-3.3.3-bin-hadoop3.tgz.asc \
    GPG_KEY=F6468A4FF8377B4F1C07BC2AA077F928A0BF68D8

RUN set -ex; \
    export SPARK_TMP="$(mktemp -d)"; \
    cd $SPARK_TMP; \
    wget -nv -O spark.tgz "$SPARK_TGZ_URL"; \
    wget -nv -O spark.tgz.asc "$SPARK_TGZ_ASC_URL"; \
    export GNUPGHOME="$(mktemp -d)"; \
    gpg --batch --keyserver hkps://keys.openpgp.org --recv-key "$GPG_KEY" || \
    gpg --batch --keyserver hkps://keyserver.ubuntu.com --recv-keys "$GPG_KEY"; \
    gpg --batch --verify spark.tgz.asc spark.tgz; \
    gpgconf --kill all; \
    rm -rf "$GNUPGHOME" spark.tgz.asc; \
    \
    tar -xf spark.tgz --strip-components=1; \
    chown -R spark:spark .; \
    mv jars /opt/spark/; \
    mv bin /opt/spark/; \
    mv sbin /opt/spark/; \
    mv kubernetes/dockerfiles/spark/decom.sh /opt/; \
    mv examples /opt/spark/; \
    mv kubernetes/tests /opt/spark/; \
    mv data /opt/spark/; \
    mv python/pyspark /opt/spark/python/pyspark/; \
    mv python/lib /opt/spark/python/lib/; \
    mv R /opt/spark/; \
    chmod a+x /opt/decom.sh; \
    cd ..; \
    rm -rf "$SPARK_TMP";

COPY entrypoint.sh /opt/

ENV SPARK_HOME /opt/spark

WORKDIR /opt/spark/work-dir

USER spark

ENTRYPOINT [ "/opt/entrypoint.sh" ]
126 changes: 126 additions & 0 deletions 3.3.3/scala2.12-java11-ubuntu/entrypoint.sh
@@ -0,0 +1,126 @@
#!/bin/bash
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Prevent any errors from being silently ignored
set -eo pipefail

attempt_setup_fake_passwd_entry() {
  # Check whether there is a passwd entry for the container UID
  local myuid; myuid="$(id -u)"
  # If there is no passwd entry for the container UID, attempt to fake one
  # with nss_wrapper. This handles the OpenShift arbitrary-UID case; see
  # https://github.com/docker-library/official-images/pull/13089#issuecomment-1534706523
  # and https://github.com/docker-library/postgres/pull/448
  if ! getent passwd "$myuid" &> /dev/null; then
    local wrapper
    for wrapper in {/usr,}/lib{/*,}/libnss_wrapper.so; do
      if [ -s "$wrapper" ]; then
        NSS_WRAPPER_PASSWD="$(mktemp)"
        NSS_WRAPPER_GROUP="$(mktemp)"
        export LD_PRELOAD="$wrapper" NSS_WRAPPER_PASSWD NSS_WRAPPER_GROUP
        local mygid; mygid="$(id -g)"
        # Pass SPARK_USER_NAME as a printf argument; inside the single-quoted
        # format string the parameter expansion would be written out literally
        printf 'spark:x:%s:%s:%s:%s:/bin/false\n' "$myuid" "$mygid" "${SPARK_USER_NAME:-anonymous uid}" "$SPARK_HOME" > "$NSS_WRAPPER_PASSWD"
        printf 'spark:x:%s:\n' "$mygid" > "$NSS_WRAPPER_GROUP"
        break
      fi
    done
  fi
}
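The faked entry follows the seven-field `passwd(5)` format `name:password:uid:gid:gecos:home:shell`, which nss_wrapper then serves to lookups in place of `/etc/passwd`. A small sketch of the line being built (the helper name and sample UID are illustrative, not part of the entrypoint):

```shell
# Sketch: build a passwd(5)-style line like the one the entrypoint writes
# for nss_wrapper; fields are name:password:uid:gid:gecos:home:shell.
fake_passwd_line() {
  local uid="$1" gid="$2" user_name="$3" home="$4"
  printf 'spark:x:%s:%s:%s:%s:/bin/false\n' "$uid" "$gid" "$user_name" "$home"
}

fake_passwd_line 1000880000 0 "anonymous uid" /opt/spark
# -> spark:x:1000880000:0:anonymous uid:/opt/spark:/bin/false
```

The shell is set to `/bin/false` deliberately: the entry only has to make UID lookups succeed, not grant an interactive login.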

if [ -z "$JAVA_HOME" ]; then
  JAVA_HOME=$(java -XshowSettings:properties -version 2>&1 > /dev/null | grep 'java.home' | awk '{print $3}')
fi

SPARK_CLASSPATH="$SPARK_CLASSPATH:${SPARK_HOME}/jars/*"
for v in "${!SPARK_JAVA_OPT_@}"; do
  SPARK_EXECUTOR_JAVA_OPTS+=( "${!v}" )
done

if [ -n "$SPARK_EXTRA_CLASSPATH" ]; then
  SPARK_CLASSPATH="$SPARK_CLASSPATH:$SPARK_EXTRA_CLASSPATH"
fi

if ! [ -z "${PYSPARK_PYTHON+x}" ]; then
  export PYSPARK_PYTHON
fi
if ! [ -z "${PYSPARK_DRIVER_PYTHON+x}" ]; then
  export PYSPARK_DRIVER_PYTHON
fi

# If HADOOP_HOME is set and SPARK_DIST_CLASSPATH is not, derive it here so the
# Hadoop jars are available to the executor. An already-set SPARK_DIST_CLASSPATH
# is left alone to avoid overriding customizations made elsewhere, e.g. in Docker/K8s.
if [ -n "${HADOOP_HOME}" ] && [ -z "${SPARK_DIST_CLASSPATH}" ]; then
  export SPARK_DIST_CLASSPATH="$($HADOOP_HOME/bin/hadoop classpath)"
fi

if ! [ -z "${HADOOP_CONF_DIR+x}" ]; then
  SPARK_CLASSPATH="$HADOOP_CONF_DIR:$SPARK_CLASSPATH";
fi

if ! [ -z "${SPARK_CONF_DIR+x}" ]; then
  SPARK_CLASSPATH="$SPARK_CONF_DIR:$SPARK_CLASSPATH";
elif ! [ -z "${SPARK_HOME+x}" ]; then
  SPARK_CLASSPATH="$SPARK_HOME/conf:$SPARK_CLASSPATH";
fi
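Note the ordering: the jars glob is appended first, then the configuration directory is prepended, so config files are found before anything bundled in `jars/`. A standalone sketch of the resulting value (paths illustrative):

```shell
# Sketch: conf dir is prepended to the classpath, so it is searched
# before the bundled jars.
SPARK_HOME=/opt/spark
SPARK_CLASSPATH="${SPARK_HOME}/jars/*"
SPARK_CLASSPATH="$SPARK_HOME/conf:$SPARK_CLASSPATH"
echo "$SPARK_CLASSPATH"   # -> /opt/spark/conf:/opt/spark/jars/*
```

The same prepend rule applies to `HADOOP_CONF_DIR` and `SPARK_CONF_DIR` above, which is what lets mounted config override image defaults.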

# If no USER was specified at runtime (i.e. we are still root), run the command
# as the spark user via gosu; otherwise run it as the current USER directly.
switch_spark_if_root() {
  if [ "$(id -u)" -eq 0 ]; then
    echo gosu spark
  fi
}

case "$1" in
  driver)
    shift 1
    CMD=(
      "$SPARK_HOME/bin/spark-submit"
      --conf "spark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS"
      --deploy-mode client
      "$@"
    )
    attempt_setup_fake_passwd_entry
    # Execute the container CMD under tini for better hygiene
    exec $(switch_spark_if_root) /usr/bin/tini -s -- "${CMD[@]}"
    ;;
  executor)
    shift 1
    CMD=(
      ${JAVA_HOME}/bin/java
      "${SPARK_EXECUTOR_JAVA_OPTS[@]}"
      -Xms"$SPARK_EXECUTOR_MEMORY"
      -Xmx"$SPARK_EXECUTOR_MEMORY"
      -cp "$SPARK_CLASSPATH:$SPARK_DIST_CLASSPATH"
      org.apache.spark.scheduler.cluster.k8s.KubernetesExecutorBackend
      --driver-url "$SPARK_DRIVER_URL"
      --executor-id "$SPARK_EXECUTOR_ID"
      --cores "$SPARK_EXECUTOR_CORES"
      --app-id "$SPARK_APPLICATION_ID"
      --hostname "$SPARK_EXECUTOR_POD_IP"
      --resourceProfileId "$SPARK_RESOURCE_PROFILE_ID"
      --podName "$SPARK_EXECUTOR_POD_NAME"
    )
    attempt_setup_fake_passwd_entry
    # Execute the container CMD under tini for better hygiene
    exec $(switch_spark_if_root) /usr/bin/tini -s -- "${CMD[@]}"
    ;;

  *)
    # Non-spark-on-k8s command provided, proceeding in pass-through mode...
    exec "$@"
    ;;
esac
2 changes: 2 additions & 0 deletions tools/template.py
@@ -28,6 +28,8 @@
    "3.3.1": "86727D43E73A415F67A0B1A14E68B3E6CD473653",
    # issuer "[email protected]"
    "3.3.2": "C56349D886F2B01F8CAE794C653C2301FEA493EE",
    # issuer "[email protected]"
    "3.3.3": "F6468A4FF8377B4F1C07BC2AA077F928A0BF68D8",
Review comment (Member):
seems wrong and failed to check public key:

13.28 + gpg --keyserver hkps://keys.openpgp.org --recv-key F6468A4FF8377B4F1C07BC2AA077F928A0BF68D8
13.29 gpg: keybox '/tmp/tmp.yZmBm6Lueo/pubring.kbx' created
14.21 gpg: key A077F928A0BF68D8: new key but contains no user ID - skipped
14.21 gpg: Total number processed: 1
14.21 gpg:           w/o user IDs: 1
14.21 + gpg --batch --verify spark.tgz.asc spark.tgz
15.90 gpg: Signature made Fri 04 Aug 2023 04:18:57 PM UTC
15.90 gpg:                using RSA key F6468A4FF8377B4F1C07BC2AA077F928A0BF68D8
15.90 gpg:                issuer "[email protected]"
15.90 gpg: Can't check signature: No public key

wangyum (Member, Author) replied on Aug 22, 2023:
It seems that the key should be uploaded via: https://keys.openpgp.org/upload.

pub   rsa4096 2023-08-04 [SC]
      F6468A4FF8377B4F1C07BC2AA077F928A0BF68D8
uid                      Yuming Wang (CODE SIGNING KEY) <[email protected]>
sub   rsa4096 2023-08-04 [E]
-----BEGIN PGP PUBLIC KEY BLOCK-----

mQINBGTMm/ABEACejvAu4Dc74oIy3VGdz4g/axvoIEyj1q7bQYW+vJhdiWY2/yK/
U+s10es2WFHL6Vq3wjdhiVwlPWLBICkxPTIGcSBan6Qlqwhepnv27YJpTy/B/C6/
wgQC//9GkifOyxOIxKU+5WNMyGZOSXBcQvHkwcxCrVmt/Hz0mNG/jKkXc8EpU/hI
dY83G6tJMgar54N+24H0NsY7pfU/9NOno6Klnn3IzOw/tLM+V5GS9un4u6D/XG15
bvT9+c234CzV2q5Zo+sDrqTE7qHfY5aqQU9WaEFCLw/n+lu39Hpgi843mw+IgNMs
hltxOvg6vbTY50LGpqcUJSXQOka5jLNVQVv3k9i9DndkVO9AU98FnMP3sK3ZFmlH
YA/4iMmZ0+41KAgadqLfqn7d19uqLZgAu88RHVUeDOZA2LoXLzjn7TrGq9kD3WRZ
st1cDV9ESxzr9X3vvhuz3JNGKUQzhaq0Mw4HWwbUWVVT3MCxI+wDNBKaN8R76PfV
ISX50y146OJ1BwHq5C6i9IEyxpTTih67z8UpNOnwL38ekwxmgJAklsprt32HejsP
8ac3zmufEDyn2zJ+7/wWHei9eW2ZxUXMbY30xFRnYAAYgaE7HY5kQqwk7P9obvMb
hSTtUVTJ9+4SgaEBABgjbziQpFB/WxAF280EnIIqX0TibuS8LC9//w7SYQARAQAB
tDNZdW1pbmcgV2FuZyAoQ09ERSBTSUdOSU5HIEtFWSkgPHl1bXdhbmdAYXBhY2hl
Lm9yZz6JAlEEEwEIADsWIQT2RopP+Dd7TxwHvCqgd/kooL9o2AUCZMyb8AIbAwUL
CQgHAgIiAgYVCgkICwIEFgIDAQIeBwIXgAAKCRCgd/kooL9o2JfOD/9ZQRH3KgSB
mEYadvVfn0NzSJ9AwJBXvOHfO5IPEZ4T/R1+UGSPxJqWsnCNAGR/NU1YHgbj/31k
CB/Taep8S7PoQaj3c8/84DsPRRUgYuLILDYTo6wrgvk4g1rHy6u49+wiUY3gCwV/
uUpggfTPeqya6HkNOgNWXXUoLWKmpWqJYoAUFUBvNr+fdyac21QdbK7zSCyVZU70
Ov2YXya82f0NrCiUbOAfmQ9oEKVb0PDoOvzd0Db6cJ/7bii849/1/gWMZQDWmRnm
rYcD+JxKgMgOMp5SZgmu5AD7htbsF+D34D6snwkWXDzfMYqVsvsL4NE1ENm7qSLG
U6V9PAsRFCVKUg8xAI3vR5HX3Pvp6e4kz3zm56GMY9knHQFzaw7HdNaPQnujLG4Q
NqSxHXBOsJ01AJXj6eZTHANA/j5cjqrBISsxuvEH7HNgWO7/b2V17UsjUUIWG5SZ
IgtgqfEPonDh1EXwerXHr9D0uKGjYUlwnUQLlSa4f1yY15GinEvPkmQOTu8ddZaU
Sl/5vxmiYbzC2iIPJMHcwxb0Df1Ynp7GKhP/LOHNH+ROB97q2be+ZgSZKgqHBops
zBN93/FhjAEDwzVuMmySyofp5KmElg/Tox0xAgxyV3zWsUji6hon3KoAqLocoHJv
IIbLn4koHhM4ewy5BN9yeGH4C5Xc9NC0ObkCDQRkzJvwARAA4BxnWU48R+wSTE4u
9oXpXOAYwriOxt3al17NrcbH7j73y4U+BXbN2P2xGYjyz6PoETAtAP6EKokSn4ei
3jvSwVdy+XQa7T5vLiCTzNjEE9paTHLasQ10dS9+j1ow4WIV9L8NRCVQWBYlFqCd
6l/O0d1OFJhXb/8BfrEzVBPBR36Ee32ZrDXhJ06h1u7nPJVc4L0A5US5RJX2RXEB
UhbXEJfhOaSRANujL6h70DLp6GFkevlFH/CFwmrNvML8AF7sUFkcNdSP8HVIRbU2
KHclRSetkFvvIALXTcVG9PPz0/TIucwpkMFtzhi4U8r0Z4krNHjgs5lhjxrrI2j0
aBGywpAMG1nTe4rIOUio4p7ze+Be0pmbx2Zl0HWxbRmtNvFKTpGHpP0E3IsnxxpZ
q6U8vOGPvVxWIZVdOfUHMR2FCHsRoi6e8Vqw0bvhc7+7+cw9t5A2dXejG/j9jQ++
F2lwRFVlzxKbLooKIQecUaBn2O+Mxo/RlKqHFjmRE+VEQiQ6jRtFESfj5fpYgeuy
XuMGTDdG4dlOXZx/GaEqbpNilY2CK9Zal6JpVFKT9GlzeITLlRURMa/3589mI4mZ
VcmVa6AiM3B2lfljaZJGrTFntl+2hUILJsQJ+t7iH04+im00dpVFwXHA5ji+yM20
4C57hNFSEEE+QFaGF5FTq5fHEekAEQEAAYkCNgQYAQgAIBYhBPZGik/4N3tPHAe8
KqB3+Sigv2jYBQJkzJvwAhsMAAoJEKB3+Sigv2jYjnQP/i1T/JFdoUcyNoFWqisy
I/EbOPwMCsBJpDttd7JiW7yWKWdawdjDjQym5FKY5hResj7nox9fObUTNVzOBHsV
PWbIypP1i4uBQrjrg/klOhV2ltoq9bBMgaijzDcmDVf5kehThBfiwNwrfMpHgPhl
raU+fmF97vhfc0NDJKIkiq57wcQyU5RMkzitiZKg2i6sZwS3AZpCuGoScyczDaGV
HvIcJHG1OrXf6q+sQ13k5BU6Q7pqvewyJbv8SI8gnAWb+tlUQGIcP2/9B0D6EAOq
hFiM0rqIrY67lKluly+1/QEPTE8wcnSjxm0k9bGHeD+p0mBxILvXNhi1q03ROhzf
7L74MSjtwbZUoeQJ/lzMLFVcNTq+k4IAXs2fLnMu2Qsuue9HHq0W4xLcHYm9CqoV
17y2VGtK1XxZd4JIYQP/ZuRt0r6zOhRBOa8aThcg6Y6p8M18sRI/rWeadsmbJ3Sf
qmgGvDUTnmYv0yi+jNiTBNn002sxRZUz543xINd/j4J+iBlX+mn5IPA7b94MRL06
rCe7HD9pJgGh1/hsJnn20rpL7rRKQlkG2cuOf6V475ws+F16+DbjnOUbMF0JIKqw
gDOpXbrcjWwiB29RAh83kbtM6QvXAnyMhEAjFPC/Sz4znXMNLoRcKUtMGx7mELWo
a+TeRisx2Fp5t+/vM5rHYFdf
=CUri
-----END PGP PUBLIC KEY BLOCK-----

After uploading the key, we will receive a verification email (screenshot omitted).

Then we can import the key:

yumwang@G9L07H60PK ~ % gpg --keyserver hkps://keys.openpgp.org --recv-key F6468A4FF8377B4F1C07BC2AA077F928A0BF68D8 
gpg: key A077F928A0BF68D8: "Yuming Wang (CODE SIGNING KEY) <[email protected]>" not changed
gpg: Total number processed: 1
gpg:              unchanged: 1

yumwang@G9L07H60PK ~ % gpg --batch --verify spark-3.3.3-bin-hadoop3.tgz.asc spark-3.3.3-bin-hadoop3.tgz
gpg: Signature made Sat  8/ 5 00:18:57 2023 CST
gpg:                using RSA key F6468A4FF8377B4F1C07BC2AA077F928A0BF68D8
gpg:                issuer "[email protected]"
gpg: Good signature from "Yuming Wang (CODE SIGNING KEY) <[email protected]>" [unknown]
gpg: WARNING: The key's User ID is not certified with a trusted signature!
gpg:          There is no indication that the signature belongs to the owner.
Primary key fingerprint: F646 8A4F F837 7B4F 1C07  BC2A A077 F928 A0BF 68D8
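One detail worth noting from the logs above: the short ID gpg printed in the failing build (`key A077F928A0BF68D8`) and the full fingerprint refer to the same key, because the 64-bit key ID is simply the last 16 hex digits of the 40-hex-digit fingerprint:

```shell
# The 64-bit key ID gpg prints is the last 16 hex digits of the
# full 160-bit (40 hex digit) fingerprint.
fpr="F6468A4FF8377B4F1C07BC2AA077F928A0BF68D8"
key_id="$(printf %s "$fpr" | tail -c 16)"
echo "$key_id"   # -> A077F928A0BF68D8
```

So the fingerprint in `tools/template.py` was correct all along; the failure was only that keys.openpgp.org had the key without a verified user ID until the upload/verification step described above.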

# issuer "[email protected]"
"3.4.0": "CC68B3D16FE33A766705160BA7E57908C7A4E1B1",
# issuer "[email protected]"
28 changes: 28 additions & 0 deletions versions.json
@@ -60,6 +60,34 @@
        "3.4.0-scala2.12-java11-python3-r-ubuntu"
      ]
    },
    {
      "path": "3.3.3/scala2.12-java11-python3-ubuntu",
      "tags": [
        "3.3.3-scala2.12-java11-python3-ubuntu",
        "3.3.3-python3",
        "3.3.3"
      ]
    },
    {
      "path": "3.3.3/scala2.12-java11-r-ubuntu",
      "tags": [
        "3.3.3-scala2.12-java11-r-ubuntu",
        "3.3.3-r"
      ]
    },
    {
      "path": "3.3.3/scala2.12-java11-ubuntu",
      "tags": [
        "3.3.3-scala2.12-java11-ubuntu",
        "3.3.3-scala"
      ]
    },
    {
      "path": "3.3.3/scala2.12-java11-python3-r-ubuntu",
      "tags": [
        "3.3.3-scala2.12-java11-python3-r-ubuntu"
      ]
    },
    {
      "path": "3.3.1/scala2.12-java11-python3-ubuntu",
      "tags": [
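Each `versions.json` entry maps a Dockerfile path to the tags published for it; note that the bare `3.3.3` tag resolves to the python3 variant. A sketch of that tag-to-path resolution as a plain lookup (hypothetical helper; the real tooling reads `versions.json` itself):

```shell
# Sketch: resolve a published tag to its Dockerfile path, mirroring the
# 3.3.3 entries above (hypothetical helper for illustration).
tag_to_path() {
  case "$1" in
    3.3.3|3.3.3-python3|3.3.3-scala2.12-java11-python3-ubuntu)
      echo "3.3.3/scala2.12-java11-python3-ubuntu" ;;
    3.3.3-r|3.3.3-scala2.12-java11-r-ubuntu)
      echo "3.3.3/scala2.12-java11-r-ubuntu" ;;
    3.3.3-scala|3.3.3-scala2.12-java11-ubuntu)
      echo "3.3.3/scala2.12-java11-ubuntu" ;;
    3.3.3-scala2.12-java11-python3-r-ubuntu)
      echo "3.3.3/scala2.12-java11-python3-r-ubuntu" ;;
    *) return 1 ;;
  esac
}

tag_to_path 3.3.3   # -> 3.3.3/scala2.12-java11-python3-ubuntu
```

Keeping the short aliases (`3.3.3`, `3.3.3-python3`, `3.3.3-r`, `3.3.3-scala`) on exactly one path each avoids publishing the same tag from two Dockerfiles.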