Use custom Maven version for all our products #1220

Merged 1 commit on Aug 1, 2025
2 changes: 2 additions & 0 deletions CHANGELOG.md
@@ -7,8 +7,10 @@ All notable changes to this project will be documented in this file.
### Changed

- all: Use our build-repo to cache NPM dependencies ([#1219])
- java: Use a more recent Maven version for all Java based products ([#1220])

[#1219]: https://github.com/stackabletech/docker-images/pull/1219
[#1220]: https://github.com/stackabletech/docker-images/pull/1220

## [25.7.0] - 2025-07-23

2 changes: 0 additions & 2 deletions druid/Dockerfile
@@ -73,8 +73,6 @@ cp -r /stackable/patched-libs/maven/* /stackable/.m2/repository
tar -czf /stackable/druid-${NEW_VERSION}-src.tar.gz .

mvn \
--batch-mode \
--no-transfer-progress \
clean install \
-Pdist,stackable-bundle-contrib-exts \
-Dhadoop.compile.version=${HADOOP_VERSION}-stackable${RELEASE} \
2 changes: 0 additions & 2 deletions hadoop/Dockerfile
@@ -52,8 +52,6 @@ cp -r /stackable/patched-libs/maven/* /stackable/.m2/repository
tar -czf /stackable/hdfs-utils-${HDFS_UTILS}-src.tar.gz .

mvn \
--batch-mode \
--no-transfer-progress \
clean package \
-P hadoop-${HADOOP_VERSION} \
-Dhadoop.version=${HADOOP_VERSION}-stackable${RELEASE} \
2 changes: 0 additions & 2 deletions hadoop/hadoop/Dockerfile
@@ -76,8 +76,6 @@ tar -czf /stackable/hadoop-${NEW_VERSION}-src.tar.gz .
# Therefore, this build does work, but the final image does NOT contain the openssl-devel package, which is why it fails there and why we have to create the symlink over there manually.
# We still leave this flag in to automatically fail should anything with the packages or symlinks ever fail.
mvn \
--batch-mode \
--no-transfer-progress \
clean package install \
-Pdist,native \
-pl '!hadoop-tools/hadoop-pipes' \
2 changes: 0 additions & 2 deletions hbase/hbase-opa-authorizer/Dockerfile
@@ -20,8 +20,6 @@ if [[ -n "$PRODUCT" ]]; then
# Create snapshot of the source code including custom patches
tar -czf /stackable/hbase-opa-authorizer-${PRODUCT}-src.tar.gz .
mvn \
--batch-mode \
--no-transfer-progress \
-DskipTests \
-Dmaven.test.skip=true \
package
2 changes: 0 additions & 2 deletions hbase/hbase-operator-tools/Dockerfile
@@ -51,8 +51,6 @@ mvn versions:set -DnewVersion=$NEW_VERSION
tar -czf /stackable/hbase-operator-tools-${FULL_HBASE_OPERATOR_TOOLS_VERSION}-src.tar.gz .

mvn \
--batch-mode \
--no-transfer-progress \
-Dhbase.version=${PATCHED_HBASE_VERSION} \
-Dhbase-thirdparty.version=${HBASE_THIRDPARTY} \
-DskipTests \
4 changes: 0 additions & 4 deletions hbase/hbase/Dockerfile
@@ -54,16 +54,12 @@ tar -czf /stackable/hbase-${NEW_VERSION}-src.tar.gz .
# I chose to replicate that exact behavior for consistency so please don't merge the two mvn runs into one unless you really know what you're doing!
# Cannot skip building tests here because the assembly plugin needs a shell script from the test directory.
mvn \
--batch-mode \
--no-transfer-progress \
-Dhadoop.profile=3.0 \
-Dhadoop-three.version=${HADOOP_VERSION}-stackable${RELEASE} \
-DskipTests \
clean install

mvn \
--batch-mode \
--no-transfer-progress \
-Dhadoop.profile=3.0 \
-Dhadoop-three.version=${HADOOP_VERSION}-stackable${RELEASE} \
-DskipTests \
2 changes: 0 additions & 2 deletions hbase/phoenix/Dockerfile
@@ -47,8 +47,6 @@ tar -czf /stackable/phoenix-${PRODUCT}-stackable${RELEASE}-src.tar.gz .
# The Maven command can be found inside of the scripts in the create-release folder (release-util.sh as of Phoenix 5.2.0)
# https://github.com/apache/phoenix/tree/5.2.0/dev/create-release
mvn \
--batch-mode \
--no-transfer-progress \
-Dhbase.version=${HBASE_VERSION}-stackable${RELEASE} \
-Dhbase.profile=${HBASE_PROFILE} \
-Dhadoop.version=${HADOOP_VERSION}-stackable${RELEASE} \
10 changes: 8 additions & 2 deletions hive/Dockerfile
@@ -56,13 +56,19 @@ mvn versions:set -DnewVersion=$NEW_VERSION -DartifactId=* -DgroupId=* -Dgenerate
tar -czf /stackable/hive-${NEW_VERSION}-src.tar.gz .

if [[ "${PRODUCT}" == "3.1.3" ]] ; then
mvn --batch-mode --no-transfer-progress clean package -DskipTests --projects standalone-metastore
mvn \
clean package \
-DskipTests \
--projects standalone-metastore
mv standalone-metastore/target/apache-hive-metastore-${NEW_VERSION}-bin/apache-hive-metastore-${NEW_VERSION}-bin /stackable
mv standalone-metastore/target/bom.json /stackable/apache-hive-metastore-${NEW_VERSION}-bin/apache-hive-metastore-${NEW_VERSION}.cdx.json
else
(
# https://issues.apache.org/jira/browse/HIVE-20451 switched the metastore server packaging starting with 4.0.0
mvn --batch-mode --no-transfer-progress clean package -DskipTests -Dhadoop.version=${HADOOP_VERSION}-stackable${RELEASE}
mvn \
clean package \
-DskipTests \
-Dhadoop.version=${HADOOP_VERSION}-stackable${RELEASE}

# We only seem to get a .tar.gz archive, so let's extract that to the correct location
tar --extract --directory=/stackable -f standalone-metastore/metastore-server/target/apache-hive-standalone-metastore-server-${NEW_VERSION}-bin.tar.gz
11 changes: 10 additions & 1 deletion java-devel/Dockerfile
@@ -10,6 +10,10 @@ FROM stackable/image/stackable-devel
ARG PRODUCT
ARG STACKABLE_USER_UID

# Find the latest version here: https://github.com/apache/maven
# renovate: datasource=github-tags packageName=apache/maven
ARG MAVEN_VERSION="3.9.10"

# See: https://adoptium.net/en-gb/installation/linux/#_centosrhelfedora_instructions
RUN cat <<EOF > /etc/yum.repos.d/adoptium.repo
[Adoptium]
@@ -46,7 +50,6 @@ microdnf install \
krb5-devel \
libcurl-devel \
make \
maven \
openssl-devel \
`# Required to unpack Omid tarball` \
tar \
@@ -56,9 +59,15 @@ microdnf install \
zlib-devel
microdnf clean all
rm -rf /var/cache/yum

curl "https://repo.stackable.tech/repository/packages/maven/apache-maven-${MAVEN_VERSION}-bin.tar.gz" | tar -xzC /tmp
mv /tmp/apache-maven-${MAVEN_VERSION} /opt/maven
ln -s /opt/maven/bin/mvn /usr/bin/mvn

EOF

ENV JAVA_HOME="/usr/lib/jvm/temurin-${PRODUCT}-jdk"
ENV MAVEN_ARGS="--batch-mode --no-transfer-progress"

COPY --chown=${STACKABLE_USER_UID}:0 java-devel/stackable/settings.xml /stackable/.m2/settings.xml
COPY --chown=${STACKABLE_USER_UID}:0 java-devel/stackable/settings.xml /root/.m2/settings.xml
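The net effect of the java-devel change above: Maven 3.9+ reads the `MAVEN_ARGS` environment variable and prepends its contents to every invocation, which is what lets this PR drop the per-call `--batch-mode --no-transfer-progress` flags from all the other Dockerfiles. A minimal sketch of the unpack-and-symlink pattern, using a stub `mvn` in a temp prefix instead of the real tarball from repo.stackable.tech:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Stub standing in for the unpacked apache-maven distribution; the real
# Dockerfile fetches the tarball and unpacks it to /opt/maven. Maven >= 3.9.0
# reads MAVEN_ARGS itself; the stub just echoes it to make that visible.
prefix=$(mktemp -d)
mkdir -p "${prefix}/opt/maven/bin" "${prefix}/usr/bin"
printf '%s\n' '#!/usr/bin/env bash' 'echo "${MAVEN_ARGS:-} $*"' \
  > "${prefix}/opt/maven/bin/mvn"
chmod +x "${prefix}/opt/maven/bin/mvn"

# Same symlink the Dockerfile creates, relocated under the temp prefix
ln -s "${prefix}/opt/maven/bin/mvn" "${prefix}/usr/bin/mvn"

export MAVEN_ARGS="--batch-mode --no-transfer-progress"
"${prefix}/usr/bin/mvn" clean install
# prints: --batch-mode --no-transfer-progress clean install
```

Because `MAVEN_ARGS` is set with `ENV`, every builder image derived from java-devel inherits the flags without repeating them.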
31 changes: 10 additions & 21 deletions nifi/Dockerfile
@@ -9,7 +9,6 @@ FROM stackable/image/java-devel AS nifi-builder

ARG PRODUCT
ARG RELEASE
ARG MAVEN_VERSION="3.9.8"
ARG STACKABLE_USER_UID

RUN <<EOF
@@ -18,20 +17,6 @@ microdnf clean all
rm -rf /var/cache/yum
EOF

# NOTE: From NiFi 2.0.0 upwards Apache Maven 3.9.6+ is required. As of 2024-07-04 the java-devel image
# ships 3.6.3. This updates Maven accordingly, depending on the product version. The error stems from the maven-enforcer-plugin.
#
# [ERROR] Rule 2: org.apache.maven.enforcer.rules.version.RequireMavenVersion failed with message:
# [ERROR] Detected Maven Version: 3.6.3 is not in the allowed range [3.9.6,).
#
RUN <<EOF
if [[ "${PRODUCT}" != 1.* ]] ; then
cd /tmp
curl "https://repo.stackable.tech/repository/packages/maven/apache-maven-${MAVEN_VERSION}-bin.tar.gz" | tar -xzC .
ln -sf /tmp/apache-maven-${MAVEN_VERSION}/bin/mvn /usr/bin/mvn
fi
EOF

USER ${STACKABLE_USER_UID}
WORKDIR /stackable

@@ -57,9 +42,17 @@ tar -czf /stackable/nifi-${NEW_VERSION}-src.tar.gz .
# NOTE: Since NiFi 2.0.0 PutIceberg Processor and services were removed, so including the `include-iceberg` profile does nothing.
# Additionally some modules were moved to optional build profiles, so we need to add `include-hadoop` to get `nifi-parquet-nar` for example.
if [[ "${PRODUCT}" != 1.* ]] ; then
mvn --batch-mode --no-transfer-progress clean install -Dmaven.javadoc.skip=true -DskipTests --activate-profiles include-hadoop,include-hadoop-aws,include-hadoop-azure,include-hadoop-gcp
mvn \
clean install \
-Dmaven.javadoc.skip=true \
-DskipTests \
--activate-profiles include-hadoop,include-hadoop-aws,include-hadoop-azure,include-hadoop-gcp
else
mvn --batch-mode --no-transfer-progress clean install -Dmaven.javadoc.skip=true -DskipTests --activate-profiles include-iceberg,include-hadoop-aws,include-hadoop-azure,include-hadoop-gcp
mvn \
clean install \
-Dmaven.javadoc.skip=true \
-DskipTests \
--activate-profiles include-iceberg,include-hadoop-aws,include-hadoop-azure,include-hadoop-gcp
fi

# Copy the binaries to the /stackable folder
@@ -118,8 +111,6 @@ if [[ "${PRODUCT}" != 1.* ]] ; then
sed -i -e "s/{{ NIFI_VERSION }}/${PRODUCT}/g" pom.xml

mvn \
--batch-mode \
--no-transfer-progress \
clean package \
-D nifi.version=${PRODUCT} \
-Dmaven.javadoc.skip=true \
@@ -161,8 +152,6 @@ cd "$(/stackable/patchable --images-repo-root=src checkout nifi/opa-plugin ${NIF
tar -czf /stackable/nifi-opa-plugin-${NIFI_OPA_AUTHORIZER_PLUGIN}-src.tar.gz .

mvn \
--batch-mode \
--no-transfer-progress \
clean package \
-DskipTests \
-Pnifi-${PRODUCT}
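The profile selection in the NiFi build above hinges on bash's `[[ ]]` pattern matching: an unquoted right-hand side like `1.*` is treated as a glob, so the test distinguishes NiFi 1.x from 2.x without parsing the version string. A small sketch of that dispatch — the profile lists are copied from the diff, while the `profiles_for` function name is made up for illustration:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Inside [[ ]], an unquoted pattern on the right of != / == is a glob,
# so 1.* matches any version string starting with "1.".
profiles_for() {
  local product="$1"
  if [[ "${product}" != 1.* ]]; then
    # NiFi 2.x: PutIceberg was removed; some hadoop modules became optional
    echo "include-hadoop,include-hadoop-aws,include-hadoop-azure,include-hadoop-gcp"
  else
    # NiFi 1.x keeps the iceberg profile
    echo "include-iceberg,include-hadoop-aws,include-hadoop-azure,include-hadoop-gcp"
  fi
}

profiles_for "1.27.0"
profiles_for "2.2.0"
```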
2 changes: 1 addition & 1 deletion omid/Dockerfile
@@ -36,7 +36,7 @@ RUN --mount=type=cache,id=maven-omid-${PRODUCT},uid=${STACKABLE_USER_UID},target

# Create snapshot of the source code including custom patches
tar -czf /stackable/omid-${NEW_VERSION}-src.tar.gz .
mvn --batch-mode --no-transfer-progress package -Phbase-2 -DskipTests
mvn package -Phbase-2 -DskipTests
tar -xf tso-server/target/omid-tso-server-${NEW_VERSION}-bin.tar.gz -C /stackable
sed -i "s/${NEW_VERSION}/${ORIGINAL_VERSION}/g" tso-server/target/bom.json
mv tso-server/target/bom.json /stackable/omid-tso-server-${NEW_VERSION}/omid-tso-server-${NEW_VERSION}.cdx.json
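The Omid hunk keeps a pattern used across these builds: Maven runs with the patched `-stackable` version stamped in, then the generated SBOM is rewritten back to the upstream version with `sed` before shipping. A minimal reproduction with a hypothetical `bom.json` fragment and made-up version strings:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Made-up versions mirroring the ORIGINAL_VERSION / NEW_VERSION pair
ORIGINAL_VERSION="1.1.2"
NEW_VERSION="1.1.2-stackable25.7.0"

# Hypothetical SBOM fragment containing the patched version string
bom=$(mktemp)
printf '{"version": "%s"}\n' "${NEW_VERSION}" > "${bom}"

# Rewrite the patched version back to the upstream one, as the Dockerfile does
sed -i "s/${NEW_VERSION}/${ORIGINAL_VERSION}/g" "${bom}"
cat "${bom}"
# prints: {"version": "1.1.2"}
```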
25 changes: 5 additions & 20 deletions spark-k8s/Dockerfile
@@ -81,14 +81,14 @@ export JDK_JAVA_OPTIONS="\

# Get the Scala version used by Spark
SCALA_VERSION=$( \
mvn --quiet --non-recursive --no-transfer-progress --batch-mode --file /stackable/spark/pom.xml \
mvn --quiet --non-recursive --file /stackable/spark/pom.xml \
org.apache.maven.plugins:maven-help-plugin:3.5.0:evaluate \
-DforceStdout \
-Dexpression='project.properties(scala.version)')

# Get the Scala binary version used by Spark
SCALA_BINARY_VERSION=$( \
mvn --quiet --non-recursive --no-transfer-progress --batch-mode --file /stackable/spark/pom.xml \
mvn --quiet --non-recursive --file /stackable/spark/pom.xml \
org.apache.maven.plugins:maven-help-plugin:3.5.0:evaluate \
-DforceStdout \
-Dexpression='project.properties(scala.binary.version)')
@@ -102,8 +102,6 @@ SCALA_BINARY_VERSION=$( \
# at org.apache.hadoop.hbase.LocalHBaseCluster.startup(LocalHBaseCluster.java:407)
# at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:250)
mvn \
--batch-mode \
--no-transfer-progress \
--define spark.version="${PRODUCT}" \
--define scala.version="${SCALA_VERSION}" \
--define scala.binary.version="${SCALA_BINARY_VERSION}" \
@@ -122,7 +120,7 @@ cd /stackable/spark/jars
# Spark contains only log4j-slf4j2-impl-x.x.x.jar but not
# log4j-slf4j-impl-x.x.x.jar. It is okay to have both JARs in the
# classpath as long as they have the same version.
mvn --quiet --non-recursive --no-transfer-progress --batch-mode --file /stackable/spark/pom.xml \
mvn --non-recursive --file /stackable/spark/pom.xml \
dependency:copy \
-Dartifact=org.apache.logging.log4j:log4j-slf4j-impl:'${log4j.version}' \
-DoutputDirectory=./jars
@@ -150,9 +148,6 @@ ARG TARGETARCH
ARG TINI
ARG RELEASE
ARG STACKABLE_USER_UID
# Find the latest version here: https://github.com/apache/maven
# renovate: datasource=github-tags packageName=apache/maven
ARG MAVEN_VERSION="3.9.10"

WORKDIR /stackable/spark-${PRODUCT}-stackable${RELEASE}

@@ -169,25 +164,15 @@ RUN <<EOF
# Make Maven aware of custom Stackable libraries
mv /stackable/patched-libs/maven /root/.m2/repository

# We download the Maven binary from our own repository because:
#
# 1. The UBI Maven version is too old:
# 134.0 [ERROR] Detected Maven Version: 3.6.3 is not in the allowed range [3.8.8,)
# 2. The Maven download from archive.apache.org is not working reliably:
curl "https://repo.stackable.tech/repository/packages/maven/apache-maven-${MAVEN_VERSION}-bin.tar.gz" | tar -xzC /tmp

ORIGINAL_VERSION="${PRODUCT}"
NEW_VERSION="${PRODUCT}-stackable${RELEASE}"

export MAVEN_OPTS="-Xss64m -Xmx2g -XX:ReservedCodeCacheSize=1g"

./dev/make-distribution.sh \
--mvn /tmp/apache-maven-${MAVEN_VERSION}/bin/mvn \
-Dhadoop.version="${HADOOP_VERSION}-stackable${RELEASE}" \
-DskipTests \
-P'hadoop-3' -Pkubernetes -Phive -Phive-thriftserver \
--no-transfer-progress \
--batch-mode
-P'hadoop-3' -Pkubernetes -Phive -Phive-thriftserver

sed -i "s/${NEW_VERSION}/${ORIGINAL_VERSION}/g" assembly/target/bom.json
EOF
@@ -200,7 +185,7 @@ RUN <<EOF

# Get the Scala binary version
SCALA_BINARY_VERSION=$( \
mvn --quiet --non-recursive --no-transfer-progress --batch-mode --file pom.xml \
mvn --quiet --non-recursive --file pom.xml \
org.apache.maven.plugins:maven-help-plugin:3.5.0:evaluate \
-DforceStdout \
-Dexpression='project.properties(scala.binary.version)')
5 changes: 1 addition & 4 deletions trino/storage-connector/Dockerfile
@@ -32,10 +32,7 @@ tar -czf /stackable/trino-storage-connector-${NEW_VERSION}-src.tar.gz .

mvn versions:set -DnewVersion=${NEW_VERSION}

# We need to use ./mvnw instead of mvn to get a recent maven version (which is required to build Trino)
./mvnw \
--batch-mode \
--no-transfer-progress \
mvn \
package \
-DskipTests `# Skip test execution` \
-Ddep.trino.version=${NEW_VERSION} `# Use custom Stackable Trino version in tests` \
5 changes: 1 addition & 4 deletions trino/trino/Dockerfile
@@ -42,10 +42,7 @@ if [ "$PRODUCT" = "470" ] || [ "$PRODUCT" = "451" ]; then
SKIP_PROJECTS="$SKIP_PROJECTS,!core/trino-server-rpm"
fi

# We need to use ./mvnw instead of mvn to get a recent maven version (which is required to build Trino)
./mvnw \
--batch-mode \
--no-transfer-progress \
mvn \
install \
`# -Dmaven.test.skip # Unable to skip test compilation without an unused dependency error for software.amazon.awssdk:identity-spi` \
-DskipTests `# Skip test execution` \
6 changes: 5 additions & 1 deletion zookeeper/Dockerfile
@@ -34,7 +34,11 @@ tar -czf /stackable/zookeeper-${NEW_VERSION}-src.tar.gz .

# Exclude the `zookeeper-client` submodule; it is not needed and has C parts
# that caused all kinds of issues for the build container
mvn --batch-mode --no-transfer-progress -pl "!zookeeper-client/zookeeper-client-c" clean install checkstyle:check spotbugs:check -DskipTests -Pfull-build
mvn \
-pl "!zookeeper-client/zookeeper-client-c" \
clean install checkstyle:check spotbugs:check \
-DskipTests \
-Pfull-build

# Unpack the archive which contains the build artifacts from above. Remove some
# unused files to shrink the final image size.