
Conversation

@pan3793 (Member) commented Aug 20, 2023

Co-authored-by: Neil Skinner [email protected]
Co-authored-by: Cheng Pan [email protected]

This PR is based on #199 and makes the project compatible with both Scala 2.12 (the default) and 2.13.

Usage:

```
sbt ++2.12 clean test
sbt ++2.13 clean test
```

or

```
SCALA_VERSION=2.12 sbt clean test
SCALA_VERSION=2.13 sbt clean test
```

or

```
make SCALA_VERSION=2.12 dev
make SCALA_VERSION=2.13 dev
```

CI is updated to cover both Scala 2.12 and 2.13 tests. I also tested some basic functionality locally with the Docker environment.
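For context, the `sbt ++2.12` form relies on sbt's standard cross-building support: `++` with a version switches the session's scalaVersion to a matching entry in crossScalaVersions. A minimal sketch of that mechanism, with illustrative version numbers not copied from the Toree build:

```scala
// build.sbt (illustrative sketch, not the actual Toree build definition).
// `sbt ++2.13 clean test` switches scalaVersion to the crossScalaVersions
// entry matching that binary version, for the duration of the sbt session.
ThisBuild / crossScalaVersions := Seq("2.12.17", "2.13.8")
ThisBuild / scalaVersion       := "2.12.17" // default when no ++ is given
```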

@pan3793 pan3793 marked this pull request as draft August 20, 2023 18:29
@pan3793 pan3793 force-pushed the PR_199 branch 2 times, most recently from 7d29aae to 2178e20 Compare August 29, 2023 04:06
@lresende (Member)

@pan3793 what is the status of this PR?

@pan3793 (Member, Author) commented Aug 26, 2024

@lresende I remember the coursier log API changed significantly, and we need to adapt to the new API to restore the log display. Let me find time in the coming weeks to fix that and finish the Scala 2.13 support.

@lresende (Member) commented Sep 2, 2024

Thank you for the updates @pan3793. I have not done much testing yet, but I updated my environment to Spark 3.4.3 + Scala 2.13, and at least it connects and establishes a Spark session... I will try to spend more time on validation during the week.

"-deprecation",
"-unchecked",
"-feature",
"-Xfatal-warnings",
@pan3793 (Member, Author):

To make the code support both Scala 2.12 and 2.13, we have to use some deprecated Scala APIs, so this scalac flag (-Xfatal-warnings) must be disabled.

object JavaConverters in package collection is deprecated (since 2.13.0): Use `scala.jdk.CollectionConverters` instead
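As a concrete illustration (not taken from this diff), this is the kind of cross-compatible code that forces keeping the deprecated import: scala.jdk.CollectionConverters does not exist in 2.12, so the 2.13-deprecated scala.collection.JavaConverters is the only option that compiles on both.

```scala
// Compiles on both 2.12 and 2.13, but emits a deprecation warning on 2.13,
// which is why -Xfatal-warnings has to be dropped.
import scala.collection.JavaConverters._

object Interop {
  // Convert a java.util.List coming from a Java API into an immutable Scala list.
  def toScala(xs: java.util.List[String]): Seq[String] = xs.asScala.toList
}
```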


addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.0" cross CrossVersion.full)
if (scalaBinaryVersion.value == "2.12") {
addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.1" cross CrossVersion.full)
@pan3793 (Member, Author):

https://github.com/scalamacros/paradise

In Scala 2.13, the plugin's functionality has been included in the compiler directly under the -Ymacro-annotations flag.
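In build terms, the usual pattern looks roughly like the sketch below; the -Ymacro-annotations branch is inferred from the paradise README above, not copied from the diff, and the setting name is illustrative.

```scala
// Illustrative sbt settings for cross-building code that uses macro annotations:
// Scala 2.12 needs the macro-paradise compiler plugin, 2.13 only needs a flag.
lazy val macroAnnotationSettings = Seq(
  libraryDependencies ++= {
    if (scalaBinaryVersion.value == "2.12")
      Seq(compilerPlugin("org.scalamacros" % "paradise" % "2.1.1" cross CrossVersion.full))
    else Nil
  },
  scalacOptions ++= {
    if (scalaBinaryVersion.value == "2.13") Seq("-Ymacro-annotations") else Nil
  }
)
```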

build.sbt Outdated
lazy val scala213 = "2.13.8"
lazy val defaultScalaVersion = sys.env.get("SCALA_VERSION") match {
case Some("2.12") => scala212
case _ => scala213
@pan3793 (Member, Author):

I made Scala 2.13 the default version.
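For completeness, the surrounding definition presumably looks something like this; only the lines quoted above come from the diff, while the 2.12 patch version and the scalaVersion wiring are assumptions:

```scala
lazy val scala212 = "2.12.17" // illustrative patch version, not from the diff
lazy val scala213 = "2.13.8"

// Let SCALA_VERSION=2.12 override the default; otherwise build with 2.13.
lazy val defaultScalaVersion = sys.env.get("SCALA_VERSION") match {
  case Some("2.12") => scala212
  case _            => scala213
}

ThisBuild / scalaVersion := defaultScalaVersion
```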

@pan3793 pan3793 changed the title [DRAFT] Support Scala 2.13 [TOREE-556] Support Scala 2.13 Sep 4, 2024
@pan3793 pan3793 marked this pull request as ready for review September 4, 2024 12:42
@pan3793 (Member, Author) commented Sep 4, 2024

@lresende this PR is ready for review now.

@pan3793 (Member, Author) commented Sep 4, 2024

also cc @requaos

@pan3793 (Member, Author) commented Oct 18, 2024

kindly ping @lresende ~

@legiondean

@pan3793, @lresende, @requaos, do we feel comfortable merging this PR to support 2.13?

@jjmeyer0

Hello all, can this be merged? If there is additional work, is there anything I can help with?

@fangyh20

Hello all, can this be merged? What is pending on this? Is there anything I can help with?

@pan3793 (Member, Author) commented Sep 16, 2025

@lresende, since there are many users requesting Scala 2.13 support, I wonder if we can move this forward.

Currently, the PR makes Toree work with both Scala 2.12 and Scala 2.13, but leaves Scala 2.13 as the default. On second thought, I think we should use Scala 2.12 as the default; this won't introduce any breaking changes. Given that Spark 3.4 and prior are EOL, we should also move to Spark 3.5. I see you are preparing the 0.6.0 release. Maybe after that, we can move to Spark 4.x and Scala 2.13. WDYT?

@lresende (Member)

I agree. Let me try to push a release candidate this weekend, and we merge after the release?

@pan3793 (Member, Author) commented Sep 16, 2025

@lresende, if I change the default Scala version to 2.12, it's fine to include this patch in 0.6. But it's also fine if you don't want to take risks and would rather defer this to 0.7.

@fangyh20

It would be super helpful if we could have this in the latest release as well.

@lresende (Member)

My suggestion is to have a last release with 2.12 that supports old Spark releases; in the future, if needed, we can branch from it and provide patches, etc. Then we can focus on Scala 2.13 for future releases. That said, it does not mean we can't do one release after another in a short period of time, mostly to have support for both Scala versions.

@fangyh20

If we do one release after another, when would Spark 4 and Scala 2.13 support be available? Is there anything I can do to help accelerate the process?

@pan3793 (Member, Author) commented Sep 20, 2025

I have updated the PR to keep Scala 2.12 as the default. @lresende, it's up to you whether to include this in the upcoming 0.6 or defer it to the next release.

@lresende (Member) commented Oct 6, 2025

@pan3793 I was going to test this locally and noticed the conflicts. I have a rebased version locally, but if you want to rebase to make sure there are no conflicts, please go ahead; otherwise, I can push the changes from my local copy.

@pan3793 (Member, Author) commented Oct 6, 2025

@lresende I have resolved the conflicts.

@pan3793 (Member, Author) commented Oct 7, 2025

CI passed; let me merge this so that other PRs get CI coverage for both Scala 2.12 and 2.13.

@pan3793 pan3793 merged commit e48d434 into apache:master Oct 7, 2025
4 checks passed
@fangyh20 commented Oct 7, 2025

Thank you for merging this. When would this be available on PyPI? Do we support Spark 4 currently?

@lresende (Member) commented Oct 7, 2025

@fangyh20 I am waiting for #229 and will try to make an RC.

@pan3793 (Member, Author) commented Oct 8, 2025

@fangyh20 Spark 4 is not supported yet (I guess we need to make Toree support Java 17+ first, then try Spark 4.0). PRs are welcome.
