RserveException: "eval failed" error on Databricks notebook, with no error code or explanation

I am running an R script in a Databricks notebook over multiple datasets (around 500). I ordered the datasets by file size to avoid errors and to process as many files as possible in the shortest time, because the script has high time complexity.

I was able to finish 400 of the 500 datasets without any issues, but the larger files keep failing with:

RserveException: eval failed

eval failed

The strange thing is that sometimes, when I rerun the notebook, the same dataset works without any issues. For the bigger files, however, I get the same error 99% of the time. There is no error code or any explanation when I expand the error message. I researched this problem, and most people who hit it get an error code; as far as I understand it has something to do with the R version or some of the libraries I installed (cluster-scoped), but I cannot figure it out.

Any ideas?

Calciferous asked 21/1, 2022 at 16:07 · Comments (2)
You can check the logs while the job is running: go to the Spark UI, open the details of the active task, and look at the stderr output. This is most likely an out-of-memory issue, since you mention it fails for larger data; in that case, try using Spark functions instead of base R so the work is distributed across the cluster. – Dextroglucose
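
As an illustration of the suggestion above (not from the original thread), here is a minimal sketch assuming a Databricks R notebook with SparkR attached; the DBFS path and the column names are hypothetical:

```r
library(SparkR)

# read the file as a distributed Spark DataFrame instead of into driver memory
df <- read.df("/mnt/data/big_dataset.csv",   # hypothetical path
              source = "csv", header = "true", inferSchema = "true")

# aggregate on the cluster; only the small summary is collected to the driver
totals <- agg(groupBy(df, df$category),      # hypothetical grouping column
              total = sum(df$value))         # hypothetical numeric column
head(collect(totals))
```

Keeping the aggregation in Spark means the full dataset never has to fit in the R driver process, which is where the Rserve evaluation runs.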
Thank you, Vivek! The issue was indeed running out of memory, although I couldn't figure out why the error message was so vague. I had to run those big files locally and increase the memory allocated within RStudio, and it worked fine! – Calciferous
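
As an alternative to raising the memory limit locally, one way to make a large file fit on a single machine is to process it in chunks, for example with readr. A minimal sketch; the file name and the value column are hypothetical:

```r
library(readr)
library(dplyr)

# read the CSV in chunks, reducing each chunk to a small summary so the
# whole file never has to sit in memory at once
partials <- read_csv_chunked(
  "big_dataset.csv",                         # hypothetical file name
  DataFrameCallback$new(function(chunk, pos) {
    chunk %>% summarise(n = n(), total = sum(value, na.rm = TRUE))
  }),
  chunk_size = 100000
)

# combine the per-chunk summaries into the final result
summarise(partials, n = sum(n), total = sum(total))
```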
