[TOREE-556] Support Scala 2.13 #218
Conversation
Force-pushed from 7d29aae to 2178e20
@pan3793 what is the status of this PR?
@lresende I remember the
Thank you for the updates @pan3793. I did not do much testing, but I updated my env to Spark 3.4.3 + Scala 2.13 and at least it connects and establishes a Spark session... I will try to spend more time on validation during the week.
"-deprecation", | ||
"-unchecked", | ||
"-feature", | ||
"-Xfatal-warnings", |
To make the code support both Scala 2.12 and 2.13, we have to use some deprecated Scala APIs, so this scalac flag must be disabled. For example, the compiler reports:

object JavaConverters in package collection is deprecated (since 2.13.0): Use `scala.jdk.CollectionConverters` instead
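For illustration, a minimal example (not taken from this PR) of the deprecated converter usage that cross-building forces the project to keep, since `scala.jdk.CollectionConverters` only exists on 2.13:

```scala
// Hypothetical snippet: cross-building for 2.12 and 2.13 keeps the old import,
// which the 2.13 compiler flags as deprecated.
import scala.collection.JavaConverters._  // deprecated since 2.13.0

object ConvertersExample {
  val javaList: java.util.List[String] = java.util.Arrays.asList("a", "b", "c")
  val scalaList: List[String] = javaList.asScala.toList  // warns on 2.13, compiles on both
}
```

With `-Xfatal-warnings` still enabled, that deprecation warning would turn into a hard failure on the 2.13 build.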
addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.0" cross CrossVersion.full) | ||
if (scalaBinaryVersion.value == "2.12") { | ||
addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.1" cross CrossVersion.full) |
https://github.com/scalamacros/paradise
In Scala 2.13, the plugin's functionality has been included in the compiler directly under the -Ymacro-annotations flag.
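A minimal sketch of how the two branches could be expressed in build.sbt; this is illustrative only (the PR's actual diff may differ) and uses standard sbt settings:

```scala
// Sketch: macro annotations need the macro-paradise compiler plugin on Scala 2.12,
// but are built into the 2.13 compiler behind the -Ymacro-annotations flag.
scalacOptions ++= {
  if (scalaBinaryVersion.value == "2.13") Seq("-Ymacro-annotations") else Seq.empty
}

libraryDependencies ++= {
  if (scalaBinaryVersion.value == "2.12")
    Seq(compilerPlugin("org.scalamacros" % "paradise" % "2.1.1" cross CrossVersion.full))
  else Seq.empty
}
```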
build.sbt
```scala
lazy val scala213 = "2.13.8"
lazy val defaultScalaVersion = sys.env.get("SCALA_VERSION") match {
  case Some("2.12") => scala212
  case _ => scala213
```
I made Scala 2.13 the default version.
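For context, a sketch of the surrounding version-selection pattern in build.sbt; only the `SCALA_VERSION` match is taken from the diff above, while the 2.12 patch version and the `ThisBuild` scoping are assumptions:

```scala
// Sketch of the version-selection pattern; values other than 2.13.8 are assumed.
lazy val scala212 = "2.12.18"  // assumed patch version
lazy val scala213 = "2.13.8"
lazy val defaultScalaVersion = sys.env.get("SCALA_VERSION") match {
  case Some("2.12") => scala212
  case _            => scala213  // 2.13 was the default at this point in the review
}

ThisBuild / scalaVersion       := defaultScalaVersion
ThisBuild / crossScalaVersions := Seq(scala212, scala213)
```

With this in place, a build could presumably opt into 2.12 by exporting `SCALA_VERSION=2.12` before invoking sbt (hypothetical usage; the PR description's exact commands are not reproduced above).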
@lresende this PR is ready for review now.
Also cc @requaos
Kindly ping @lresende ~
Hello all, can this be merged? If there is additional work, is there anything I can help with?
Hello all, can this be merged? What's pending on this? Is there anything I can help with?
@lresende, since there are many users requesting Scala 2.13 support, I wonder if we can move this forward. Currently, the PR makes Toree work with both Scala 2.12 and Scala 2.13, but leaves Scala 2.13 as the default. On second thought, I think we should use Scala 2.12 as the default, so this won't introduce any breaking changes. Given that Spark 3.4 and prior are EOL, we should also move to Spark 3.5. I see you are preparing the 0.6.0 release; maybe after that, we can move to Spark 4.x and Scala 2.13. WDYT?
I agree. Let me try to push a release candidate this weekend, and we can merge after the release?
@lresende, if I change the default Scala version to 2.12, it's also fine to include this patch in 0.6. But it's also fine if you don't want to take the risk and would rather defer this to 0.7.
It would be super helpful if we could have this in the latest release as well.
My suggestion is to have a last release with 2.12 that supports old Spark releases; in the future, if needed, we can branch from it and provide patches, etc. Then we can focus on Scala 2.13 for future releases. Having said that, it does not mean we can't do one release after another in a short period of time, mostly to have support for both Scala versions.
If we do one release after another, when would Spark 4 and Scala 2.13 support be available? Is there anything I can do to help accelerate the process?
I have updated the PR to keep Scala 2.12 as the default. @lresende, it's up to you to include this in the upcoming 0.6 or defer it to the next release.
@pan3793 I was going to test this locally and noticed the conflicts. I have a rebased version locally, but if you want to rebase to make sure there are no conflicts, please go ahead; otherwise, I can push the changes from my local copy.
@lresende I have resolved the conflicts.
CI passed; let me merge this to allow other PRs to have CI coverage for both Scala 2.12 and 2.13.
Thank you for merging this. When would this be available on PyPI? Do we support Spark 4 currently?
@fangyh20 Spark 4 is not supported yet (I guess we need to make it support Java 17+ first, then try Spark 4.0); PRs are welcome.
Co-authored-by: Neil Skinner [email protected]
Co-authored-by: Cheng Pan [email protected]
This PR is based on #199, and makes the project compatible with both Scala 2.12 (default) and 2.13.
Usage:
or
or
CI is updated to cover both Scala 2.12 and 2.13 tests. I also tested some basic functionality locally with the Docker environment.