Support JDK 17 PermittedSubclasses in classfiles #10105
Conversation
Force-pushed from 0cf154b to da5ed4f.
Does Scala's `sealed` feature in Java sources and Java-generated classfiles?
I had some WIP branch for this, sorry I didn't share... LGTM already.
In my branch I included `Flags.SEALED` in the modifiers of the `ClassDef`, which triggers population of children in the namer (IIRC) and enables exhaustivity checking. However, that's not enough, because Java symbols are only completed lazily, so only completed symbols are going to be known children. We can defer that to later, I guess.
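To illustrate what setting `Flags.SEALED` and populating children would ultimately buy, here is the plain-Scala analogue of the exhaustivity checking described above. This is a sketch for illustration only, not code from this PR; the PR's goal is to get the same behavior for hierarchies defined in Java classfiles.

```scala
// Plain-Scala analogue: when the parent is sealed and its children are known,
// the pattern matcher can check matches for exhaustivity.
sealed trait Shape
final case class Circle(r: Double)    extends Shape
final case class Square(side: Double) extends Shape

object Exhaustivity {
  def area(s: Shape): Double = s match {
    case Circle(r)    => math.Pi * r * r
    // Dropping this case would produce a "match may not be exhaustive" warning,
    // because the compiler knows the complete set of children of Shape.
    case Square(side) => side * side
  }
}
```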
(Resolved review thread on src/compiler/scala/tools/nsc/symtab/classfile/ClassfileParser.scala, now outdated.)
@som-snytt care to respond to the review feedback? this one would be really nice to get into 2.13.11
That is not sarcastic 👍. I don't know if we need to stipulate that every time now?
Is that a reference to https://www.walesonline.co.uk/news/uk-news/generation-z-ers-giving-passive-25249315 ? In my world, thumbs-up are always assumed to be sincere 👍
I'm aware that 👎 is always sincere.
Force-pushed from 0710c7b to 20e7535.
Children added per PermittedSubclasses. Drafting to support permits clause.
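Independently of scalac's `ClassfileParser`, a quick way to inspect the `PermittedSubclasses` entries that javac writes is the ASM library. The sketch below assumes ASM 9.x (`org.ow2.asm:asm`) is on the classpath and that `Shape.class` is a hypothetical classfile compiled from a JDK 17 sealed type.

```scala
import java.nio.file.{Files, Paths}
import org.objectweb.asm.{ClassReader, ClassVisitor, Opcodes}

object DumpPermittedSubclasses {
  def main(args: Array[String]): Unit = {
    // "Shape.class" is a placeholder; pass a real classfile path as the first argument.
    val bytes  = Files.readAllBytes(Paths.get(args.headOption.getOrElse("Shape.class")))
    val reader = new ClassReader(bytes)
    reader.accept(new ClassVisitor(Opcodes.ASM9) {
      // Invoked once per entry of the PermittedSubclasses classfile attribute.
      override def visitPermittedSubclass(permittedSubclass: String): Unit =
        println(s"permitted subclass: $permittedSubclass")
    }, ClassReader.SKIP_CODE)
  }
}
```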
@som-snytt this still seems like a high-value PR to me; let us know if you could use any input from the team in helping it get across the finish line.
@SethTisue thanks, I just checked out this branch today as part of my periodic attempt to "groom" and "reduce" my open PRs. Please let me know if 2.13.11 is imminent.
Force-pushed from 20e7535 to 62288fd.
Force-pushed from 62288fd to 7bba454.
Well, it won't come out before 3.3.0. I started https://contributors.scala-lang.org/t/scala-2-13-11-release-planning/6088 just now.
With any luck, a rebase on top of #10310 will fix Travis-CI. :knockonwood:
Force-pushed from 7bba454 to a4ae8ef.
This PR gets you to the ISS; if you want to dock with Tiangong, then you need the other PR.
Force-pushed from 42bf9bb to 3afe1a7.
@SethTisue rebased in honor of today's Starship test launch. Can we merge this? Then I can rebase the follow-up for mixed compilation.
Just to provide a reference blog post on Java sealed classes.
If we split up the support for Java …
So I suggest removing the …
Since we're so close to release, having two separate PRs makes me uncomfortable as well. If there are two PRs, we can only merge the first one if it would be acceptable (if not ideal) to release even if the second PR has to be pushed to 2.13.12. But surely it makes the most sense to either include both, or neither? I think it's confusing to users to ship partial support. (Som, I understand that your intention is not to ship partial support, but regardless, I think being this close to release time forces us to be more careful than we might otherwise be.)
Force-pushed from 3afe1a7 to 80ee36a.
Reverted to Aug 10, when flag was not set.
Title changed: "Support `sealed` feature in Java sources and Java-generated classfiles" → "Support `sealed` feature in Java-generated classfiles" → "Support JDK 17 PermittedSubclasses in classfiles".
Accept (non-)sealed in Java sources. Defer setting sealed from source to a follow-up PR, which will populate child classes lazily.
Force-pushed from 80ee36a to 5849a58.
A bit of follow-up at #10383.
@SethTisue from now on, I'm only going to use the word "support" in the sense of "moral support".
### What changes were proposed in this pull request?
This PR aims to upgrade Scala to 2.13.11 (https://www.scala-lang.org/news/2.13.11). Additionally, it adds a new suppression rule for the warning message `Implicit definition should have explicit type`, a new compile check introduced by scala/scala#10083; we must fix it when we upgrade to Scala 3.

### Why are the changes needed?
This release improves collections, adds support for JDK 20 and 21, and adds support for JDK 17 `sealed`:
- scala/scala#10363
- scala/scala#10184
- scala/scala#10397
- scala/scala#10348
- scala/scala#10105

There are two known issues in this version:
- scala/bug#12800
- scala/bug#12799

For the first one, there are no compilation warnings about `match may not be exhaustive` in the Spark compile log. For the second one, there is no use of `method.isAnnotationPresent(Deprecated.class)` in Spark code; the only related use is https://github.com/apache/spark/blob/8c84d2c9349d7b607db949c2e114df781f23e438/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/JavaTypeInference.scala#L130, and I checked that `javax.annotation.Nonnull` does not have this issue. So these two issues should not affect Spark itself, though they may still affect code written by end users.

The full release notes: https://github.com/scala/scala/releases/tag/v2.13.11

### Does this PR introduce _any_ user-facing change?
Yes, this is a Scala version change.

### How was this patch tested?
- Existing tests
- Checked Java 8/17 + Scala 2.13.11 using GA; all tests passed
  - Java 8 + Scala 2.13.11: https://github.com/LuciferYang/spark/runs/14337279564
  - Java 17 + Scala 2.13.11: https://github.com/LuciferYang/spark/runs/14343012195

Closes #41626 from LuciferYang/SPARK-40497.

Authored-by: yangjie01 <yangjie01@baidu.com>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
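For context on the suppression rule mentioned above: under 2.13.11, an implicit definition whose result type is left to inference triggers the new `Implicit definition should have explicit type` warning. Below is a minimal sketch of the pattern and its fix; the names are illustrative, not Spark's code.

```scala
final case class Separator(value: String)

object Before {
  // Warns under Scala 2.13.11: "Implicit definition should have explicit type"
  implicit val defaultSeparator = Separator(",")
}

object After {
  // Annotating the result type explicitly silences the warning.
  implicit val defaultSeparator: Separator = Separator(",")
}
```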
### What changes were proposed in this pull request?
This PR aims to re-upgrade Scala to 2.13.11 (https://www.scala-lang.org/news/2.13.11); after SPARK-45144 was merged, the build issues mentioned in #41943 should no longer exist. Additionally, it adds a new suppression rule for the warning message `Implicit definition should have explicit type`, a new compile check introduced by scala/scala#10083; we must fix it when we upgrade to Scala 3.

### Why are the changes needed?
This release improves collections, adds support for JDK 20 and 21, and adds support for JDK 17 `sealed`:
- scala/scala#10363
- scala/scala#10184
- scala/scala#10397
- scala/scala#10348
- scala/scala#10105

There are two known issues in this version:
- scala/bug#12800
- scala/bug#12799

For the first one, there are no compilation warnings about `match may not be exhaustive` in the Spark compile log. For the second one, there is no use of `method.isAnnotationPresent(Deprecated.class)` in Spark code; the only related use is https://github.com/apache/spark/blob/8c84d2c9349d7b607db949c2e114df781f23e438/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/JavaTypeInference.scala#L130, and I checked that `javax.annotation.Nonnull` does not have this issue. So these two issues should not affect Spark itself, though they may still affect code written by end users.

The full release notes: https://github.com/scala/scala/releases/tag/v2.13.11

### Does this PR introduce _any_ user-facing change?
Yes, this is a Scala version change.

### How was this patch tested?
- Existing tests

### Was this patch authored or co-authored using generative AI tooling?
No

Closes #42918 from LuciferYang/SPARK-40497-2.

Authored-by: yangjie01 <yangjie01@baidu.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
Fixes scala/bug#12171
Fixes scala/bug#12159
Also adds test coverage for scala/bug#12474
`ClassfileParser` hint from #9228.
Implementation: a quick hack takes `non-sealed` in Java sources as three tokens.
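A minimal sketch of the "three tokens" idea; purely illustrative, not the compiler's actual scanner or parser code.

```scala
object NonSealedSketch {
  sealed trait JToken
  final case class Ident(name: String) extends JToken
  case object Minus                    extends JToken
  case object SealedKw                 extends JToken

  // `non-sealed` has no dedicated token: it arrives as the adjacent tokens
  // `non`, `-`, `sealed`, which are folded into a single modifier.
  // (A real implementation would also check that no whitespace separates them.)
  def startsWithNonSealed(tokens: List[JToken]): Boolean = tokens match {
    case Ident("non") :: Minus :: SealedKw :: _ => true
    case _                                      => false
  }
}
```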