See the resourceLimits option you can pass to the Worker constructor.
Quoting from the doc:
> **`resourceLimits`** `<Object>`
>
> An optional set of resource limits for the new JS engine instance. Reaching these limits will lead to termination of the Worker instance. These limits only affect the JS engine, and no external data, including no ArrayBuffers. Even if these limits are set, the process may still abort if it encounters a global out-of-memory situation.
>
> - `maxOldGenerationSizeMb` — The maximum size of the main heap in MB.
> - `maxYoungGenerationSizeMb` — The maximum size of a heap space for recently created objects.
> - `codeRangeSizeMb` — The size of a pre-allocated memory range used for generated code.
I haven't seen anything documented about how nodejs chooses the defaults when you don't specify these, but setting them explicitly gives you some control.
I would add that nodejs is not a particularly memory-efficient environment when you're trying to fit into a resource-constrained target. It keeps things in memory that are probably not really needed, such as all the original source code it compiles. Garbage collection also gives it a wider range of memory usage than manually controlled allocation would. It's particularly bad at reading a large set of data all at once and parsing it into lots of small strings, since that can lead to high peak memory usage even though much of it gets garbage collected once things calm down.
As another example, you could read and parse a 10MB JSON file and have it take up 10x that amount of memory when stored as Javascript data. This isn't meant as an indictment of nodejs in general (I'm a big fan), but you have to be a lot more careful with how you use the tool when you're trying to fit into a smaller amount of memory. A Worker Thread also carries a bunch of extra memory overhead (its own heap, its own compiled code, etc.), much more so than a thread in something like C/C++.
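You can get a rough feel for this yourself. The sketch below parses a synthetic JSON payload and compares its byte size to the heap growth it causes; the exact ratio varies with V8 version and GC timing, so treat the numbers as indicative only.

```javascript
// Compare the byte size of a JSON string to the heap growth caused
// by parsing it into many small objects and strings.
function heapGrowthOfParse(json) {
  if (global.gc) global.gc(); // only available when run with --expose-gc
  const before = process.memoryUsage().heapUsed;
  const data = JSON.parse(json);
  const after = process.memoryUsage().heapUsed;
  // Keep `data` referenced so it can't be collected before we measure.
  return {
    bytesIn: Buffer.byteLength(json),
    heapDelta: after - before, // can be noisy if GC runs mid-measurement
    items: data.length,
  };
}

// Synthetic payload: lots of small objects with short string values.
const json = JSON.stringify(
  Array.from({ length: 50000 }, (_, i) => ({ id: i, tag: 't' + i }))
);
const r = heapGrowthOfParse(json);
console.log(`${r.bytesIn} bytes of JSON -> ~${r.heapDelta} bytes of heap`);
```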