Conversation

@davesmith00000
Contributor

Relates to: #6206

I'm trying to organise local testing.

@davesmith00000
Contributor Author

Testing results are inconclusive.

I'm testing on a Mill-based monorepo with ~25 modules, most of which are Scala.js modules and many of which have Scala.js test modules. All I'm doing is manually running:

./mill clean
./mill __.compile
./mill __.fastLinkJS

The first two operations always succeed, which gives fastLinkJS a clear run.

With this PR, you can add the following line to ScalaJSModule and ScalaJSTests modules:

override def scalaJSParallel = Task { false }
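
For reference, a minimal sketch of where that override sits in a build file (module names and version numbers are placeholders, the test framework and dependencies are omitted, and it assumes this PR's scalaJSParallel task):

import mill._, mill.scalajslib._

object app extends ScalaJSModule {
  def scalaVersion   = "3.3.4"   // placeholder versions
  def scalaJSVersion = "1.17.0"

  // the task this PR adds; false disables the linker's internal parallelism
  override def scalaJSParallel = Task { false }

  object test extends ScalaJSTests {
    def testFramework = "utest.runner.Framework" // placeholder; test deps omitted
    override def scalaJSParallel = Task { false }
  }
}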

With parallel set to either true or false, the build fails during the fastLinkJS phase.

With parallel set to false, on my machine the build starts throwing OutOfMemory exceptions once job parallelism reaches 10.

With parallel set to true, the build actually gets a little further, but grinds to a halt at 8 concurrent jobs.

The next step would probably be profiling. As a feature, based on the visible behaviour change, I feel confident saying it is probably doing the right thing (i.e. changing the linker flag). However, it doesn't solve the problem of Mill crashing when there is a high number of concurrent Scala.js builds, so I'm ambivalent about whether it's worth continuing here or not.

@lefou
Member

lefou commented Nov 19, 2025

I wonder if scalaJsParallel should be a task or just a def. Its value should not affect or invalidate the results of any other task. Running in parallel or not should be transparent: the result should always be the same.

Also, a good default could be based on the value of TaskCtx.jobs, but since TaskCtx.jobs is only available inside a task, we need a "don't care" state. So we would need def scalaJsParallel: Option[Boolean] = None and default to non-parallel mode when Task.jobs != 1.
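
A rough sketch of that shape, purely illustrative: resolvedScalaJSParallel is a hypothetical helper, and Task.jobs is taken from the wording above rather than checked against the current API:

// Plain def: changing it should not invalidate any other task
def scalaJSParallel: Option[Boolean] = None

// Hypothetical helper resolving the "don't care" state:
// only let the linker parallelise when Mill itself runs single-threaded
def resolvedScalaJSParallel = Task.Anon {
  scalaJSParallel.getOrElse(Task.jobs == 1)
}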

@davesmith00000
Contributor Author

I wonder if scalaJsParallel should be a task or just a def

I was 100% cargo-culting an existing flag. If you think it shouldn't be a task, or if the naming feels wrong / non-idiomatic, I'm happy to alter it.

Also, a good default could be based on the value of TaskCtx.jobs

I don't think they're particularly related. In my case, for example, I'm sometimes running ./mill -j1 __.fastLinkJS because Mill is too aggressive with Scala.js modules, but whether the Scala.js modules themselves link in parallel makes little difference. That said, I see where you're coming from: there is some user intent to limit parallelism / concurrency there. Happy to be advised.

I briefly showed this to @sjrd (hope you don't mind me sharing!) and he provided a comment relevant, I think, both to my wider problem and to the comment above about parallelism default values:

The parallelism of the linker is not the issue. It's running several linkers in parallel, because each consumes a lot of memory while it's working. For one linker, running it in parallel or not won't change how much memory it consumes. What you need to do is prevent several linkers from running in parallel on separate projects.

@lefou
Member

lefou commented Nov 20, 2025

So if it is advised not to run multiple Scala.js linkers in parallel, or to limit the overall parallelism of these jobs, we could manage and cap the maximum number of concurrently running linkers, meaning we simply block until a slot becomes free. This should be implemented in ScalaJSWorker, which is already shared between all ScalaJSModules.

This is of course a different solution from the current state of this PR. I think the proposed ScalaJSModule.scalaJSParallel isn't needed.
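
As a rough illustration of the idea above (withLinkerSlot and doLink are made-up names, the slot count is arbitrary, and the real ScalaJSWorker internals will differ), a plain JVM semaphore could cap concurrent link runs:

import java.util.concurrent.Semaphore

// Hypothetical throttle: at most two linkers run at once; the limit would be configurable
private val linkerSlots = new Semaphore(2)

def withLinkerSlot[A](body: => A): A = {
  linkerSlots.acquire()        // blocks until a slot becomes free
  try body
  finally linkerSlots.release()
}

// Usage sketch inside the worker:
// withLinkerSlot { doLink(moduleDeps, config) }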

@davesmith00000
Contributor Author

Yes indeed. The PR isn't invalid or wrong (sbt exposes this linker option), it just doesn't help me in the way I hoped it might. Happy for it to be closed without merging if you don't want it.

@lefou
Member

lefou commented Nov 20, 2025

if you don't want it.

Asking @lolgab, our ScalaJS expert, if we want it. I will not merge it as-is for the reasons outlined above: #6207 (comment)

I wonder if scalaJsParallel should be a task or just a def. Its value should not affect or invalidate the results of any other task. Running in parallel or not should be transparent: the result should always be the same.

@lolgab
Member

lolgab commented Nov 20, 2025

if you don't want it.

Asking @lolgab, our ScalaJS expert, if we want it. I will not merge it as-is for the reasons outlined above: #6207 (comment)

I wonder if scalaJsParallel should be a task or just a def. Its value should not affect or invalidate the results of any other task. Running in parallel or not should be transparent: the result should always be the same.

I think it's valuable to expose the setting if it's available as a setting in Scala.js.

If we make it a plain def scalaJSParallel: Boolean, then it can't depend on scalaJSVersion, etc.
I think it's better to use a Task, as we do for the other Scala.js settings.
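
For example, purely to illustrate the dependency (the condition is invented, not a real compatibility rule):

// As a Task, the setting can read other tasks such as scalaJSVersion
override def scalaJSParallel = Task {
  scalaJSVersion().startsWith("1.")  // illustrative condition only
}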
