GC overhead limit when performing fastOptJS with 2gb · Issue #3767 · scala-js/scala-js · GitHub

GC overhead limit when performing fastOptJS with 2gb #3767

@matthughes

Description

Using Scala.js 0.6.28 / Scala 2.12.9 / JDK 11 / sbt 1.2.7
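For reference, a minimal sketch of what such a build typically looks like with sbt-crossproject; the module and file names here are illustrative, not taken from the actual project:

```scala
// build.sbt (sketch): shared code cross-compiled for JVM and JS,
// plus a JS-only client and a JVM-only server, mirroring the
// commonJVM/commonJS/client/server layout described in this report.
lazy val common = crossProject(JSPlatform, JVMPlatform)
  .crossType(CrossType.Pure)
  .in(file("common"))
  .settings(scalaVersion := "2.12.9")

lazy val commonJVM = common.jvm
lazy val commonJS  = common.js

// client compiles to JS; its test suite is what fastOptJS links below
lazy val client = project
  .enablePlugins(ScalaJSPlugin)
  .dependsOn(commonJS)

lazy val server = project
  .dependsOn(commonJVM)
```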

I get the following stack-trace blow-up quite regularly when running `;clean;test` at the root of my multi-module project. It's a pretty standard commonJVM/commonJS/client/server setup.

[info] Fast optimizing client/target/scala-2.12/scalajs-bundler/test/client-test-fastopt.js
[debug] Forcing garbage collection...
[error] java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: GC overhead limit exceeded
[error] 	at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
[error] 	at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
[error] 	at sbt.ConcurrentRestrictions$$anon$4.take(ConcurrentRestrictions.scala:213)
[error] 	at sbt.Execute.next$1(Execute.scala:110)
[error] 	at sbt.Execute.processAll(Execute.scala:113)
[error] 	at sbt.Execute.runKeep(Execute.scala:90)
[error] 	at sbt.EvaluateTask$.liftedTree1$1(EvaluateTask.scala:420)
[error] 	at sbt.EvaluateTask$.run$1(EvaluateTask.scala:419)
[error] 	at sbt.EvaluateTask$.runTask(EvaluateTask.scala:438)
[error] 	at sbt.internal.Aggregation$.$anonfun$timedRun$4(Aggregation.scala:99)
[error] 	at sbt.EvaluateTask$.withStreams(EvaluateTask.scala:355)
[error] 	at sbt.internal.Aggregation$.timedRun(Aggregation.scala:97)
[error] 	at sbt.internal.Aggregation$.runTasks(Aggregation.scala:111)
[error] 	at sbt.internal.Aggregation$.$anonfun$applyTasks$1(Aggregation.scala:67)
[error] 	at sbt.Command$.$anonfun$applyEffect$2(Command.scala:137)
[error] 	at sbt.internal.Aggregation$.$anonfun$evaluatingParser$11(Aggregation.scala:212)
[error] 	at sbt.internal.Act$.$anonfun$actParser0$3(Act.scala:435)
[error] 	at sbt.Command$.process(Command.scala:181)
[error] 	at sbt.MainLoop$.processCommand(MainLoop.scala:151)
[error] 	at sbt.MainLoop$.$anonfun$next$2(MainLoop.scala:139)
[error] 	at sbt.State$$anon$1.runCmd$1(State.scala:246)
[error] 	at sbt.State$$anon$1.process(State.scala:250)
[error] 	at sbt.MainLoop$.$anonfun$next$1(MainLoop.scala:139)
[error] 	at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
[error] 	at sbt.MainLoop$.next(MainLoop.scala:139)
[error] 	at sbt.MainLoop$.run(MainLoop.scala:132)
[error] 	at sbt.MainLoop$.$anonfun$runWithNewLog$1(MainLoop.scala:110)
[error] 	at sbt.io.Using.apply(Using.scala:22)
[error] 	at sbt.MainLoop$.runWithNewLog(MainLoop.scala:104)
[error] 	at sbt.MainLoop$.runAndClearLast(MainLoop.scala:59)
[error] 	at sbt.MainLoop$.runLoggedLoop(MainLoop.scala:44)
[error] 	at sbt.MainLoop$.runLogged(MainLoop.scala:35)
[error] 	at sbt.StandardMain$.runManaged(Main.scala:138)
[error] 	at sbt.xMain.run(Main.scala:89)
[error] 	at xsbt.boot.Launch$$anonfun$run$1.apply(Launch.scala:111)
[error] 	at xsbt.boot.Launch$.withContextLoader(Launch.scala:130)
[error] 	at xsbt.boot.Launch$.run(Launch.scala:111)
[error] 	at xsbt.boot.Launch$$anonfun$apply$1.apply(Launch.scala:37)
[error] 	at xsbt.boot.Launch$.launch(Launch.scala:119)
[error] 	at xsbt.boot.Launch$.apply(Launch.scala:20)
[error] 	at xsbt.boot.Boot$.runImpl(Boot.scala:56)
[error] 	at xsbt.boot.Boot$.main(Boot.scala:18)
[error] 	at xsbt.boot.Boot.main(Boot.scala)
[error] Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
[error] 	at java.base/java.io.DataInputStream.readUTF(DataInputStream.java:666)
[error] 	at java.base/java.io.DataInputStream.readUTF(DataInputStream.java:569)
[error] 	at org.scalajs.core.ir.InfoSerializers$Deserializer.$anonfun$readStrings$1(InfoSerializers.scala:103)
[error] 	at org.scalajs.core.ir.InfoSerializers$Deserializer$$Lambda$28170/0x00000008046d9040.apply(Unknown Source)
[error] 	at scala.collection.generic.GenTraversableFactory.fill(GenTraversableFactory.scala:89)
[error] 	at org.scalajs.core.ir.InfoSerializers$Deserializer.readList(InfoSerializers.scala:100)
[error] 	at org.scalajs.core.ir.InfoSerializers$Deserializer.readStrings(InfoSerializers.scala:103)
[error] 	at org.scalajs.core.ir.InfoSerializers$Deserializer.$anonfun$deserialize$2(InfoSerializers.scala:124)
[error] 	at org.scalajs.core.ir.InfoSerializers$Deserializer$$Lambda$28169/0x00000008046de040.apply(Unknown Source)
[error] 	at scala.collection.generic.GenTraversableFactory.fill(GenTraversableFactory.scala:89)
[error] 	at org.scalajs.core.ir.InfoSerializers$Deserializer.readList(InfoSerializers.scala:100)
[error] 	at org.scalajs.core.ir.InfoSerializers$Deserializer.readPerClassStrings$1(InfoSerializers.scala:124)
[error] 	at org.scalajs.core.ir.InfoSerializers$Deserializer.readMethod$1(InfoSerializers.scala:136)
[error] 	at org.scalajs.core.ir.InfoSerializers$Deserializer.$anonfun$deserialize$3(InfoSerializers.scala:150)
[error] 	at org.scalajs.core.ir.InfoSerializers$Deserializer$$Lambda$28168/0x00000008046df840.apply(Unknown Source)
[error] 	at scala.collection.generic.GenTraversableFactory.fill(GenTraversableFactory.scala:89)
[error] 	at org.scalajs.core.ir.InfoSerializers$Deserializer.readList(InfoSerializers.scala:100)
[error] 	at org.scalajs.core.ir.InfoSerializers$Deserializer.deserialize(InfoSerializers.scala:150)
[error] 	at org.scalajs.core.ir.InfoSerializers$.deserializeWithVersion(InfoSerializers.scala:38)
[error] 	at org.scalajs.core.tools.io.VirtualSerializedScalaJSIRFile.infoAndTree(VirtualFiles.scala:178)
[error] 	at org.scalajs.core.tools.io.VirtualSerializedScalaJSIRFile.infoAndTree$(VirtualFiles.scala:175)
[error] 	at org.scalajs.core.tools.io.MemVirtualSerializedScalaJSIRFile.infoAndTree(MemFiles.scala:107)
[error] 	at org.scalajs.core.tools.io.VirtualScalaJSIRFile.tree(VirtualFiles.scala:146)
[error] 	at org.scalajs.core.tools.io.VirtualScalaJSIRFile.tree$(VirtualFiles.scala:145)
[error] 	at org.scalajs.core.tools.io.MemVirtualSerializedScalaJSIRFile.tree(MemFiles.scala:107)
[error] 	at org.scalajs.core.tools.io.IRFileCache$PersistentIRFile.$anonfun$loadTree$1(IRFileCache.scala:271)
[error] 	at org.scalajs.core.tools.io.IRFileCache$PersistentIRFile$$Lambda$28431/0x0000000804cdd840.apply$mcV$sp(Unknown Source)
[error] 	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
[error] 	at org.scalajs.core.tools.io.IRFileCache.org$scalajs$core$tools$io$IRFileCache$$clearOnThrow(IRFileCache.scala:283)
[error] 	at org.scalajs.core.tools.io.IRFileCache$PersistentIRFile.loadTree(IRFileCache.scala:269)
[error] 	at org.scalajs.core.tools.io.IRFileCache$PersistentIRFile.tree(IRFileCache.scala:258)
[error] 	at org.scalajs.core.tools.linker.frontend.BaseLinker.$anonfun$linkInternal$2(BaseLinker.scala:107)
[error] java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: GC overhead limit exceeded
[error] Use 'last' for the full log.

I wouldn't really know where to begin trying to minimize this. I don't know the Scala.js internals well enough, and it may be a red herring, but it looks like deserializing the Scala.js IR files is what blows the heap. If it helps, the ten largest .sjsir files in this project range from 220K to 624K (found with `find . -name "*.sjsir" -size +100k | xargs ls -lhsSr | grep -v bloop | tail -n 10`).
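The report itself doesn't include a workaround, but the usual first step for this class of failure is giving the sbt JVM more headroom than the 2 GB that overflowed. A hedged sketch (the 4g figure is a guess, not from the issue): put one JVM flag per line in a `.jvmopts` file at the project root, which the standard sbt launcher script reads automatically:

```shell
# .jvmopts: one JVM flag per line, picked up by the sbt launcher script.
# Raise the max heap beyond the 2g that hit "GC overhead limit exceeded"
# here; 4g is an illustrative value, adjust to the machine.
-Xmx4g
```

Alternatively, the same flag can be passed for a single invocation with `sbt -J-Xmx4g ";clean;test"`.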

Metadata


    Labels

    invalid: Contains a factual error which makes the report invalid. Closed.
    wontfix: We decided not to fix this issue/not implement that feature request.
