For fun, I implemented hyperoperators in Python as the module hyperop. One of my examples is Graham's number. Assuming the computer running this program had an infinite amount of memory, I'm interested in where Python would break when running the following:
    def GrahamsNumber():
        # This may take a while...
        g = 4
        for n in range(1, 64 + 1):
            g = hyperop(g + 2)(3, 3)
        return g
The condensed version of the class hyperop looks like this:
    class hyperop(object):
        def __init__(self, n):
            self.n = n
            # Base case: the rank below addition acts as the successor function
            self.lower = hyperop(n - 1) if n > 1 else (lambda a, b: b + 1)

        def _repeat(self, a, b):
            # Addition (n == 1) needs one extra copy of a, since the fold
            # seeds its accumulator with the first element
            if self.n == 1:
                yield a

            i = 1
            while True:
                yield a
                if i == b:
                    break
                i += 1

        def __call__(self, a, b):
            # reduce is a builtin in Python 2.7
            return reduce(lambda x, y: self.lower(y, x), self._repeat(a, b))
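As a quick sanity check of my own (not from the module's documentation), the first few ranks reproduce addition, multiplication, exponentiation, and tetration:

    print(hyperop(1)(3, 3))  # 6              addition
    print(hyperop(2)(3, 3))  # 9              multiplication
    print(hyperop(3)(3, 3))  # 27             exponentiation
    print(hyperop(4)(3, 3))  # 7625597484987  tetration, 3^(3^3)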
Essentially the library is just a recursive fold-right operation, with a special definition for the base case of n=1. Originally, __call__ was beautifully golfed as:

    return reduce(lambda x, y: self.lower(y, x), [a, ] * b)
However, it turns out that you can't make a list with more elements than fit in a C long (sys.maxsize in CPython). That was a fun limitation that most Python programmers probably don't encounter in their day-to-day work, and it inspired the following question.
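A minimal sketch of that limit (my own illustration; the exact error text may vary by platform and Python version):

    import sys

    # CPython caps sequence lengths at sys.maxsize (a C ssize_t), so a
    # repeat count above that can't even be converted to an index:
    try:
        [3] * (sys.maxsize + 1)
    except (OverflowError, MemoryError) as err:
        print(err)  # e.g. "cannot fit 'long' into an index-sized integer"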
Where, if at all, will the hyperop calculation fail due to a technical limitation of Python (specifically 2.7.10)?
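One candidate I can already see (my own sketch, not a confirmed failure point): __init__ builds the tower of lower operators recursively, so merely constructing hyperop(n) for large n should exhaust the interpreter's stack, and after the first pass of the loop g + 2 is already astronomically larger than any recursion limit:

    import sys

    # Each hyperop(n) constructs hyperop(n - 1) in __init__, so building
    # the operator itself consumes one stack frame per rank:
    print(sys.getrecursionlimit())  # typically 1000
    try:
        hyperop(10 ** 6)
    except RuntimeError as err:  # RecursionError in Python 3
        print(err)  # "maximum recursion depth exceeded"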
Are you asking about the interplay of yield and "infinite sequences", or about something that can be pinned down more succinctly? – Espouse