[DNM] Bump Spark to 3.5.4 #8597

Draft: wants to merge 1 commit into main.
20 changes: 10 additions & 10 deletions .github/workflows/util/install_spark_resources.sh
@@ -63,26 +63,26 @@ case "$1" in
  3.5)
      # Spark-3.5
      cd ${INSTALL_DIR} && \
-     wget -nv https://archive.apache.org/dist/spark/spark-3.5.2/spark-3.5.2-bin-hadoop3.tgz && \
-     tar --strip-components=1 -xf spark-3.5.2-bin-hadoop3.tgz spark-3.5.2-bin-hadoop3/jars/ && \
-     rm -rf spark-3.5.2-bin-hadoop3.tgz && \
+     wget -nv https://archive.apache.org/dist/spark/spark-3.5.4/spark-3.5.4-bin-hadoop3.tgz && \
+     tar --strip-components=1 -xf spark-3.5.4-bin-hadoop3.tgz spark-3.5.4-bin-hadoop3/jars/ && \
+     rm -rf spark-3.5.4-bin-hadoop3.tgz && \
      mkdir -p ${INSTALL_DIR}/shims/spark35/spark_home/assembly/target/scala-2.12 && \
      mv jars ${INSTALL_DIR}/shims/spark35/spark_home/assembly/target/scala-2.12 && \
-     wget -nv https://github.com/apache/spark/archive/refs/tags/v3.5.2.tar.gz && \
-     tar --strip-components=1 -xf v3.5.2.tar.gz spark-3.5.2/sql/core/src/test/resources/ && \
+     wget -nv https://github.com/apache/spark/archive/refs/tags/v3.5.4.tar.gz && \
+     tar --strip-components=1 -xf v3.5.4.tar.gz spark-3.5.4/sql/core/src/test/resources/ && \
      mkdir -p shims/spark35/spark_home/ && \
      mv sql shims/spark35/spark_home/
      ;;
  3.5-scala2.13)
      # Spark-3.5, scala 2.13
      cd ${INSTALL_DIR} && \
-     wget -nv https://archive.apache.org/dist/spark/spark-3.5.2/spark-3.5.2-bin-hadoop3.tgz && \
-     tar --strip-components=1 -xf spark-3.5.2-bin-hadoop3.tgz spark-3.5.2-bin-hadoop3/jars/ && \
-     rm -rf spark-3.5.2-bin-hadoop3.tgz && \
+     wget -nv https://archive.apache.org/dist/spark/spark-3.5.4/spark-3.5.4-bin-hadoop3.tgz && \
+     tar --strip-components=1 -xf spark-3.5.4-bin-hadoop3.tgz spark-3.5.4-bin-hadoop3/jars/ && \
+     rm -rf spark-3.5.4-bin-hadoop3.tgz && \
      mkdir -p ${INSTALL_DIR}/shims/spark35-scala2.13/spark_home/assembly/target/scala-2.13 && \
      mv jars ${INSTALL_DIR}/shims/spark35-scala2.13/spark_home/assembly/target/scala-2.13 && \
-     wget -nv https://github.com/apache/spark/archive/refs/tags/v3.5.2.tar.gz && \
-     tar --strip-components=1 -xf v3.5.2.tar.gz spark-3.5.2/sql/core/src/test/resources/ && \
+     wget -nv https://github.com/apache/spark/archive/refs/tags/v3.5.4.tar.gz && \
+     tar --strip-components=1 -xf v3.5.4.tar.gz spark-3.5.4/sql/core/src/test/resources/ && \
      mkdir -p shims/spark35-scala2.13/spark_home/ && \
      mv sql shims/spark35-scala2.13/spark_home/
      ;;
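Since the script now pulls the 3.5.4 artifacts, it may be worth verifying the new tarball against Apache's published digest once (a sketch; archive.apache.org hosts a .sha512 file alongside each release tarball, though its exact layout can vary by release, so the comparison below is manual):

    # Download the published SHA-512 digest and compare it with the local tarball's digest.
    wget -nv https://archive.apache.org/dist/spark/spark-3.5.4/spark-3.5.4-bin-hadoop3.tgz.sha512
    sha512sum spark-3.5.4-bin-hadoop3.tgz
    cat spark-3.5.4-bin-hadoop3.tgz.sha512  # the two digests should match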
22 changes: 11 additions & 11 deletions .github/workflows/velox_backend.yml
@@ -855,9 +855,9 @@ jobs:
          dnf module -y install python39 && \
          alternatives --set python3 /usr/bin/python3.9 && \
          pip3 install setuptools && \
-         pip3 install pyspark==3.5.2 cython && \
+         pip3 install pyspark==3.5.4 cython && \
          pip3 install pandas pyarrow
-     - name: Build and Run unit test for Spark 3.5.2 (other tests)
+     - name: Build and Run unit test for Spark 3.5.4 (other tests)
        run: |
          cd $GITHUB_WORKSPACE/
          export SPARK_SCALA_VERSION=2.12
@@ -898,9 +898,9 @@ jobs:
          dnf module -y install python39 && \
          alternatives --set python3 /usr/bin/python3.9 && \
          pip3 install setuptools && \
-         pip3 install pyspark==3.5.2 cython && \
+         pip3 install pyspark==3.5.4 cython && \
          pip3 install pandas pyarrow
-     - name: Build and Run unit test for Spark 3.5.2 with scala-2.13 (other tests)
+     - name: Build and Run unit test for Spark 3.5.4 with scala-2.13 (other tests)
        run: |
          cd $GITHUB_WORKSPACE/
          export SPARK_SCALA_VERSION=2.13
@@ -930,7 +930,7 @@ jobs:
        with:
          name: arrow-jars-centos-7-${{github.sha}}
          path: /root/.m2/repository/org/apache/arrow/
-     - name: Build and Run unit test for Spark 3.5.2 (slow tests)
+     - name: Build and Run unit test for Spark 3.5.4 (slow tests)
        run: |
          cd $GITHUB_WORKSPACE/
          $MVN_CMD clean test -Pspark-3.5 -Pbackends-velox -Pceleborn -Piceberg -Pdelta -Phudi -Pspark-ut \
@@ -964,9 +964,9 @@
          dnf module -y install python39 && \
          alternatives --set python3 /usr/bin/python3.9 && \
          pip3 install setuptools && \
-         pip3 install pyspark==3.5.2 cython && \
+         pip3 install pyspark==3.5.4 cython && \
          pip3 install pandas pyarrow
-     - name: Build and Run unit test for Spark 3.5.2 (other tests)
+     - name: Build and Run unit test for Spark 3.5.4 (other tests)
        run: |
          cd $GITHUB_WORKSPACE/
          export SPARK_SCALA_VERSION=2.12
@@ -995,7 +995,7 @@
        with:
          name: arrow-jars-centos-7-${{github.sha}}
          path: /root/.m2/repository/org/apache/arrow/
-     - name: Build and Run unit test for Spark 3.5.2 (slow tests)
+     - name: Build and Run unit test for Spark 3.5.4 (slow tests)
        run: |
          cd $GITHUB_WORKSPACE/
          $MVN_CMD clean test -Pspark-3.5 -Pbackends-velox -Pceleborn -Piceberg -Pdelta -Pspark-ut \
@@ -1028,9 +1028,9 @@
          dnf module -y install python39 && \
          alternatives --set python3 /usr/bin/python3.9 && \
          pip3 install setuptools && \
-         pip3 install pyspark==3.5.2 cython && \
+         pip3 install pyspark==3.5.4 cython && \
          pip3 install pandas pyarrow
-     - name: Build and Run unit test for Spark 3.5.2 (other tests)
+     - name: Build and Run unit test for Spark 3.5.4 (other tests)
        run: |
          cd $GITHUB_WORKSPACE/
          export SPARK_SCALA_VERSION=2.12
@@ -1059,7 +1059,7 @@
        with:
          name: arrow-jars-centos-7-${{github.sha}}
          path: /root/.m2/repository/org/apache/arrow/
-     - name: Build and Run unit test for Spark 3.5.2 (slow tests)
+     - name: Build and Run unit test for Spark 3.5.4 (slow tests)
        run: |
          cd $GITHUB_WORKSPACE/
          $MVN_CMD clean test -Pspark-3.5 -Pbackends-velox -Pceleborn -Piceberg -Pdelta -Pspark-ut \
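The same pyspark pin recurs across several workflow steps, so a final sweep for stray references to the old patch version may be useful before merging (a sketch; run from the repository root, against the files this PR touches):

    # List any remaining occurrences of the previous Spark patch version.
    grep -rn "3\.5\.2" .github/ docs/ pom.xml tools/gluten-it/pom.xml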
4 changes: 2 additions & 2 deletions docs/get-started/Velox.md
@@ -9,7 +9,7 @@ parent: Getting-Started

 | Type  | Version                      |
 |-------|------------------------------|
-| Spark | 3.2.2, 3.3.1, 3.4.4, 3.5.2   |
+| Spark | 3.2.2, 3.3.1, 3.4.4, 3.5.4   |
 | OS    | Ubuntu20.04/22.04, Centos7/8 |
 | jdk   | openjdk8/jdk17               |
 | scala | 2.12                         |
@@ -18,7 +18,7 @@ parent: Getting-Started

 Currently, with static build Gluten+Velox backend supports all the Linux OSes, but is only tested on **Ubuntu20.04/Ubuntu22.04/Centos7/Centos8**. With dynamic build, Gluten+Velox backend support **Ubuntu20.04/Ubuntu22.04/Centos7/Centos8** and their variants.
 
-Currently, the officially supported Spark versions are 3.2.2, 3.3.1, 3.4.4 and 3.5.2.
+Currently, the officially supported Spark versions are 3.2.2, 3.3.1, 3.4.4 and 3.5.4.
 
 We need to set up the `JAVA_HOME` env. Currently, Gluten supports **java 8** and **java 17**.
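The doc hunk above still requires JAVA_HOME to be set; for completeness, a typical setup on a jdk17 machine might look like the following (a sketch; the JDK install path is an assumption and varies by distro):

    # Point Gluten's build at a supported JDK (java 8 or java 17).
    export JAVA_HOME=/usr/lib/jvm/java-17-openjdk   # assumed path; adjust for your system
    export PATH=${JAVA_HOME}/bin:${PATH}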

2 changes: 1 addition & 1 deletion docs/get-started/build-guide.md
@@ -74,4 +74,4 @@ It's name pattern is `gluten-<backend_type>-bundle-spark<spark.bundle.version>_<
 | 3.2.2 | 3.2 | 2.12 |
 | 3.3.1 | 3.3 | 2.12 |
 | 3.4.4 | 3.4 | 2.12 |
-| 3.5.2 | 3.5 | 2.12 |
+| 3.5.4 | 3.5 | 2.12 |
2 changes: 1 addition & 1 deletion pom.xml
@@ -336,7 +336,7 @@
     <properties>
       <sparkbundle.version>3.5</sparkbundle.version>
       <sparkshim.artifactId>spark-sql-columnar-shims-spark35</sparkshim.artifactId>
-      <spark.version>3.5.2</spark.version>
+      <spark.version>3.5.4</spark.version>
       <iceberg.version>1.5.0</iceberg.version>
       <delta.package.name>delta-spark</delta.package.name>
       <delta.version>3.2.0</delta.version>
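With the spark-3.5 profile's default spark.version bumped, a local rebuild against 3.5.4 could be exercised as follows (a sketch; the profile set mirrors the CI commands above, and -DskipTests is optional):

    # Rebuild Gluten against the bumped Spark version using the spark-3.5 profile.
    mvn clean install -Pspark-3.5 -Pbackends-velox -DskipTests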
2 changes: 1 addition & 1 deletion tools/gluten-it/pom.xml
@@ -170,7 +170,7 @@
     <profile>
       <id>spark-3.5</id>
       <properties>
-        <spark.version>3.5.2</spark.version>
+        <spark.version>3.5.4</spark.version>
         <scala.library.version>2.12.18</scala.library.version>
       </properties>
     </profile>
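gluten-it carries its own copy of the spark.version property, so the integration-test tool needs the same profile when built (a sketch; run from the repository root):

    # Build the gluten-it integration-test tool against Spark 3.5.4.
    mvn -f tools/gluten-it/pom.xml clean install -Pspark-3.5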