Error: C stack usage is too close to the limit

I'm attempting to run some fairly deep recursive code in R and it keeps giving me this error:

Error: C stack usage is too close to the limit

My output from Cstack_info() is:

Cstack_info()
    size    current  direction eval_depth 
67108864       8120          1          2 

I have plenty of memory on my machine; I'm just trying to figure out how to increase the C stack size for R.

EDIT: Someone asked for a reproducible example. Here's some basic sample code that causes the problem: run f(1,1) a few times and you'll get the error. Note that I've already set --max-ppsize=500000 and options(expressions=500000), so if you don't set those you might get an error about one of those two limits instead. As you can see, the recursion can go pretty deep here, and I've got no idea how to get it to work consistently. Thanks.

f <- function(root=1, lambda=1) {
    x <- c(0, 1)
    prob <- c(1/(lambda+1), lambda/(lambda+1))
    repeat {
      if (root == 0) {
        break
      } else {
        child <- sample(x, 2, replace=TRUE, prob)
        if (child[1] == 0 && child[2] == 0) {
          break
        }
        if (child[1] == 1) {
          child[1] <- f(root=child[1], lambda)
        }
        if (child[2] == 1 && child[1] == 0) {
          child[2] <- f(root=child[2], lambda)
        }
      }
      if (child[1] == 0 && child[2] == 0) {
        break
      }
      if (child[1] == 1 || child[2] == 1) {
        root <- sample(x, 1, replace=TRUE, prob)
      }
    }
    return(root)
}
Superficies answered 6/2, 2013 at 0:9 Comment(8)
This question suggests perhaps options(expressions = somethinglarge). – Outnumber
@Outnumber The expression nesting depth, the pointer protection stack, and the C stack are three separate (but related) things. – Benzine
Thanks so much for your prompt response, Zack. I think your answer may be for a Linux OS, though? I'm currently running Windows 7 64-bit; does that change things at all? Thanks again for any help. – Superficies
My answer should be valid for any Unix variant (of which Linux and OS X are the most common nowadays), but yeah, I have no idea what the Windows equivalent is. – Benzine
Googling the error message shows that in the past this has usually been an error in user code, so you should probably reduce your problem to a simple reproducible example and post that here. – Singspiel
I'm not sure there is an error in the code at all. This is simply a case of probabilities that could in theory end up in infinite recursion. f(1,1) is basically flipping a coin; it could keep coming up heads forever. For a condition where the level of recursion is unknown and unbounded, you are better off with something more iterative, using memoization of prior sample() results to inform future operations. Then the only thing you risk is running out of vector memory, or disk, depending on where you store your backlog of results. Recursion can be expensive and brittle. – Pickmeup
The same error comes from a simple googlesheets::gs_ls(). There is no recursion, as I am running it on the command line of RStudio. I use a Mac. How is this harmless Google Sheets ls command giving "Error: C stack usage 7970624 is too close to the limit"? It seems strange. I could not find any answer until now; this is the closest thread to the problem, but the marked answer does not give a fix. – Underhand
Had the same error with shiny, trying to runApp a file which also included a runApp statement. Removing the duplicated statement fixed the issue. – Limousine

The stack size is an operating system parameter, adjustable per-process (see setrlimit(2)). You can't adjust it from within R as far as I can tell, but you can adjust it from the shell before starting R, with the ulimit command. It works like this:

$ ulimit -s # print default
8192
$ R --slave -e 'Cstack_info()["size"]'
   size 
8388608

8388608 = 1024 * 8192; R is printing the same value as ulimit -s, but in bytes instead of kilobytes.

$ ulimit -s 16384 # enlarge stack limit to 16 megs
$ R --slave -e 'Cstack_info()["size"]'
    size 
16777216 

To make the adjustment permanent, add the ulimit command to your shell startup file, so it's executed every time you log in. I can't give more specific directions than that, because it depends on exactly which shell you use. I also don't know how to do the equivalent for a graphical login session (which matters if you're not running R inside a terminal window).
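If you only need the larger stack for a single run, a subshell keeps the change from leaking into the rest of your session (a sketch; the 16384 value is just an example, as above):

```shell
# Raise the stack limit only for this one R invocation: the parentheses
# run ulimit in a subshell, so the parent shell's limit is untouched.
(ulimit -s 16384 && R --slave -e 'Cstack_info()["size"]')
```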

Benzine answered 6/2, 2013 at 0:20 Comment(9)
...or just set it to unlimited. – Conventicle
The RAppArmor package offers an interface to setrlimit(2). This functionality may become available in the ulimit package at some point. – Prelect
This function no longer exists in the RAppArmor package. Any ideas where it went? – Touraine
What is the fix for Windows? – Submerged
@Shana I haven't the faintest idea. Ask a new question, specifically mention+tag Windows, and hopefully someone who does know will answer. – Benzine
Changing the limit will not resolve this if the recursion never terminates: the function will simply continue to run until the higher limit is reached. – Thumbsdown
Do you have to set it every time you log in / every time the machine is rebooted? – Alcazar
@Dr_Hope Yes. You can put the ulimit command in your .profile, or there may be a file in /etc that can be edited to change the limit system-wide. – Benzine
This setting seems to have reset when I rebooted. Is there an option to change it permanently? – Sheers

This error is not due to memory; it is due to recursion: a function is calling itself. This isn't always obvious from examining the definition of only one function. To illustrate the point, here is a minimal example of two functions that call each other:

change_to_factor <- function(x){
  x <- change_to_character(x)
  as.factor(x)
} 

change_to_character <- function(x){
  x <- change_to_factor(x)
  as.character(x)
}

change_to_character("1")

Error: C stack usage 7971600 is too close to the limit

The functions will keep calling each other recursively and will theoretically never complete; even if you increase the limit, it will still be exceeded. Only the checks within your system prevent this from running indefinitely and consuming all of your machine's compute resources. You need to alter the functions to ensure that they won't call themselves (or each other) recursively without end.
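One common fix, sketched below with made-up function names, is to thread an explicit depth counter through the recursion and bail out past a cap (the cap of 1000 here is arbitrary):

```r
# Depth-guarded recursion (illustrative): abort with an R-level error
# long before the C stack limit is reached.
count_down <- function(n, depth = 1, max_depth = 1000) {
  if (depth > max_depth) {
    stop("recursion deeper than ", max_depth, "; aborting")
  }
  if (n <= 0) return(0)
  1 + count_down(n - 1, depth + 1, max_depth)
}

count_down(10)   # returns 10
```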

Thumbsdown answered 19/12, 2018 at 12:37 Comment(1)
Refusing recursion narrows the field of problems you can solve by computer. I'd advise instead using so-called terminators in each recursively called function. The role of a terminator is to conditionally stop further recursive calls; the best way is to count how deep in the recursion you are and stop as soon as you reach a given limit (before the system error occurs). – Mak

I suspect that, regardless of stack limit, you'll end up with recursions that are too deep. For instance, with lambda = Inf, f(1) leads to an immediate recursion, indefinitely. The depth of the recursion seems to be a random walk, with some probability r of going deeper, 1 - r of finishing the current recursion. By the time you've hit the stack limit, you've made a large number of steps 'deeper'. This implies that r > 1 / 2, and the very large majority of time you'll just continue to recurse.

Also, it seems possible to derive an analytic (or at least numerical) solution even in the face of infinite recursion: define p as the probability that f(1) == 1, write implicit expressions for the 'child' states after a single iteration, equate these with p, and solve. p can then be used as the probability of success in a single draw from a binomial distribution.

Singspiel answered 6/2, 2013 at 7:47 Comment(2)
Here, actually, is the hidden correct answer: make sure you don't get that deep in recursion... – Sporophore
In my case, the error was caused by sourcing the same R script multiple times (i.e., in multiple R scripts) in my project. – Tensimeter

This happened to me for a completely different reason. I accidentally created a super long string while combining two columns:

output_table_subset = mutate(big_data_frame,
     combined_table = paste0(first_part, second_part, col = "_"))

instead of

output_table_subset = mutate(big_data_frame,
     combined_table = paste(first_part, second_part, sep = "_"))

(Note that sep is an argument of paste, not paste0; paste0 would just treat the extra string as one more thing to paste.) Took me forever to figure out, as I never expected the paste to have caused the problem.
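For reference, the difference between element-wise joining and collapsing, with made-up columns x and y:

```r
x <- c("a", "b")
y <- c("1", "2")

# sep joins element-wise: one output string per row
paste(x, y, sep = "_")           # "a_1" "b_2"

# collapse flattens everything into a single (possibly huge) string
paste(c(x, y), collapse = "_")   # "a_b_1_2"
```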

Social answered 22/4, 2015 at 8:21 Comment(1)
Same here, but I was doing a summarize. I had summarize(states = paste0(state, collapse=', ')) when I should have done something like summarize(states = paste0(sort(unique(state)), collapse=', ')). The goal was to get a comma-separated list of the unique states available for each subgroup. – Nasal

I encountered the same "C stack usage is too close to the limit" error (albeit for a different application than the one stated by user2045093 above). I tried zwol's proposal, but it didn't work out.

To my own surprise, I could solve the problem by installing the newest version of R for OS X (currently: version 3.2.3) as well as the newest version of RStudio for OS X (currently: 0.99.840), since I am working in RStudio.

Hopefully this may be of some help to you as well.

Confirmation answered 31/12, 2015 at 16:6 Comment(1)
I switched to a higher version of R. It worked once, but the error reappeared and is now consistent. Help! – Armistead

Mine is perhaps a more unusual case, but it may help the few who have this exact problem.

My case had absolutely nothing to do with space usage, yet R still gave:
C stack usage is too close to the limit

I had defined an upgraded version of the base function:

saveRDS()

But accidentally, this new function was itself named saveRDS() instead of safe_saveRDS(). Thus, past that definition, when the code reached the line which actually uses saveRDS(...) (meaning the original base version, not the upgraded one), it called the new definition instead, gave the above error, and crashed.

So, if you're getting this error when calling some saving function, check that you haven't accidentally overridden it.

Striate answered 12/12, 2019 at 20:23 Comment(0)

One issue here can be that you're calling the function inside itself:

plop <- function(a = 2){
  pouet <- sample(a)
  plop(pouet)
}
plop()
Error: evaluations nested too deeply: infinite recursion / options(expressions=)?
Error during wrapup: evaluations nested too deeply: infinite recursion / options(expressions=)?
Ablepsia answered 26/2, 2018 at 17:49 Comment(0)

I often include a commented-out source("path/to/file/thefile.R") line at the top of an R script, e.g. thefile.R, so I can easily copy-paste this into the terminal to run it. I get this error if I forget to comment out the line, since running the file runs the file, which runs the file, which runs the file, ...

If that is the cause, the solution is simple: comment out the line.

Desecrate answered 1/9, 2020 at 19:5 Comment(0)

On Linux, I permanently increased the stack and memlock limits as follows:

sudo vi /etc/security/limits.conf

Then add the following lines at the end of the file (they take effect at your next login):

* soft memlock unlimited
* hard memlock unlimited

* soft stack unlimited
* hard stack unlimited
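These limits.conf entries are read by PAM when a new login session starts, so log out and back in, then verify from the new session (a quick check):

```shell
# Confirm the new per-process limits in a fresh login shell:
ulimit -s   # stack size; should print "unlimited" after the change above
ulimit -l   # max locked memory (memlock)
```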
Besprinkle answered 26/11, 2020 at 10:10 Comment(1)
Do you know which service to restart to avoid a reboot? – Fulgurant

For everyone's information, I suddenly ran into this with R 3.6.1 on Windows 7 (64-bit). It was not a problem before, and now stack-limit errors seem to pop up everywhere when I try to save() data or even do a save.image(). It's as if the serialization is blowing the stack away.

I am seriously considering dropping back to 3.6.0; it didn't happen there.

Downspout answered 17/7, 2019 at 1:50 Comment(0)

Not sure if we're listing issues here, but it happened to me with leaflet(). I was trying to map a data frame in which a date column was of class POSIXlt; converting it back to POSIXct solved the issue.

Saphena answered 18/11, 2020 at 11:24 Comment(0)

I faced the same issue. It won't be solved by reinstalling R or RStudio, or by increasing the stack size. Here is the cause and the fix:

If you are sourcing a.R inside b.R and at the same time sourcing b.R inside a.R, the stack will fill up very fast.

Problem

This is the first file a.R in which b.R is sourced

#---- a.R File -----
source("/b.R")
...
...
#--------------------

This is the second file b.R, in which a.R is sourced

#---- b.R File -----
source("/a.R")
...
...
#--------------------

Solution: source only one file, to avoid the files recursively calling each other.

#---- a.R File -----
source("/b.R")
...
...
#--------------------

#---- b.R File -----
...
...
#--------------------

OR

#---- a.R File -----
...
...
...
#--------------------

#---- b.R File -----
source("/a.R")
...
...
#--------------------
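If both files genuinely need each other's definitions, a simple include-guard also breaks the cycle (a sketch; the flag name is made up):

```r
# In b.R (illustrative): set a session flag so repeated source() calls
# become no-ops instead of recursing.
if (!exists(".b_already_sourced", envir = globalenv())) {
  assign(".b_already_sourced", TRUE, envir = globalenv())
  # ... definitions from b.R, including any source("a.R") call ...
}
```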
Librium answered 19/12, 2022 at 18:13 Comment(0)

As Martin Morgan wrote, the problem is that you get too deep into the recursion. If the recursion does not converge at all, you need to break it yourself: pass a depth counter down and give up (returning NA) past a limit. Note that checking for NA must use is.na(), since comparing with == NA always yields NA.

f <- function(root=1, lambda=1, depth=1) {
  if (depth > 256) {
    return(NA)
  }
  x <- c(0, 1)
  prob <- c(1/(lambda+1), lambda/(lambda+1))
  repeat {
    if (root == 0) {
      break
    } else {
      child <- sample(x, 2, replace=TRUE, prob)
      if (child[1] == 0 && child[2] == 0) {
        break
      }
      if (child[1] == 1) {
        child[1] <- f(root=child[1], lambda, depth+1)
        if (is.na(child[1])) return(NA)   # depth limit hit below us
      }
      if (child[2] == 1 && child[1] == 0) {
        child[2] <- f(root=child[2], lambda, depth+1)
        if (is.na(child[2])) return(NA)   # depth limit hit below us
      }
    }
    if (child[1] == 0 && child[2] == 0) {
      break
    }
    if (child[1] == 1 || child[2] == 1) {
      root <- sample(x, 1, replace=TRUE, prob)
    }
  }
  return(root)
}
Sporophore answered 12/5, 2015 at 17:56 Comment(0)

If you're using plot_ly, check which columns you are passing. It seems that for POSIXlt/POSIXct columns you have to apply as.character() before passing them to plotly, or you get this exception!

Nert answered 12/7, 2019 at 18:18 Comment(0)

Here is how I encountered this error message: I met it when I tried to print a data.table in the console. It turned out I had mistakenly created a super long string (by using collapse in paste() when I shouldn't have) in a column.

Michamichael answered 30/9, 2021 at 15:4 Comment(0)

Just for your information: the caret package has a function called createDataPartition that can result in this error when the dataset to be partitioned has more than 1M rows.

Attention answered 20/9, 2022 at 6:34 Comment(0)

Many reasons can produce this error. I would suggest using debugonce(your_function) to find out which line of code is causing the problem, and then fixing it.

Chessa answered 21/12, 2023 at 10:42 Comment(0)

Another way to cause the same problem:

library(debug)
mtrace(lapply)

The recursive call isn't as obvious here.

Proliferation answered 23/5, 2019 at 20:49 Comment(0)
