I am running an R script in a Databricks notebook over multiple datasets (around 500). I ordered the datasets by file size so that I could process as many files as possible in the shortest time and catch failures early, since the script has high time complexity.
I was able to finish 400/500 datasets without any issues, but the large files keep failing with:
```
RserveException: eval failed
eval failed
```
The weird thing about the error is that sometimes, when I rerun the notebook, the same dataset works without any issues. However, 99% of the time I get the same error for the bigger files. There is no error code or any further explanation when expanding the error message. I researched this problem, and most people who hit it get an accompanying error code; as far as I can tell it has something to do with the R version or some of the libraries I installed (cluster-scoped), but I cannot figure it out.
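For context, the driver loop looks roughly like this (a sketch, not my exact code; `process_dataset()` and the paths are placeholders for my actual script):

```r
# Sketch of the driver loop. process_dataset() stands in for my real
# analysis (the part with high time complexity); paths are placeholders.
process_dataset <- function(path) {
  df <- read.csv(path)   # placeholder for the real work
  nrow(df)
}

files <- list.files("/dbfs/mnt/data", full.names = TRUE)
files <- files[order(file.size(files))]  # smallest files first

results <- lapply(files, function(f) {
  tryCatch(
    process_dataset(f),
    error = function(e) {
      # Try to surface the underlying R error instead of the bare
      # "eval failed" that Rserve reports back to the notebook
      message("Failed on ", f, ": ", conditionMessage(e))
      NULL
    }
  )
})
```

Even with the `tryCatch()` wrapper, for the large files I only ever see the opaque `RserveException: eval failed` from the notebook side.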
Any ideas?