How to implement a smooth clamp function in python?

The clamp function is clamp(x, min, max) = min if x < min, max if x > max, else x
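
In plain Python, the non-smooth version is just (a reference sketch; the argument names lo and hi are mine):

def clamp(x, lo, hi):
    # ordinary clamp: piecewise linear, with a kink in the derivative at lo and hi
    return lo if x < lo else hi if x > hi else x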

I need a function that behaves like the clamp function, but is smooth (i.e. has a continuous derivative).

Mannes answered 18/7, 2017 at 11:25 Comment(1)
Essentially, after rescaling you can read this problem as a convolution with the Heaviside function. The kernel you choose determines the kind of approximation you perform. If your kernel has infinite support you'll end up with a kind of sigmoidal approximation. If it has finite support, the function will reach the min and max valuesJerrodjerrol

Normal clamp:

np.clip(x, mi, mx)

Smoothclamp (guaranteed to agree with normal clamp for x < min and x > max):

def smoothclamp(x, mi, mx): return mi + (mx-mi)*(lambda t: np.where(t < 0 , 0, np.where( t <= 1 , 3*t**2-2*t**3, 1 ) ) )( (x-mi)/(mx-mi) )

Sigmoid (approximates clamp; never smaller than min, never larger than max):

def sigmoid(x,mi, mx): return mi + (mx-mi)*(lambda t: (1+200**(-t+0.5))**(-1) )( (x-mi)/(mx-mi) )

For some purposes Sigmoid will be better than Smoothclamp because Sigmoid is an invertible function - no information is lost.
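
For example, the inverse of the sigmoid above can be written in closed form (a minimal sketch; the name sigmoid_inv is mine, and it is only defined for mi < y < mx):

def sigmoid_inv(y, mi, mx):
    # undoes sigmoid(x, mi, mx); only valid for mi < y < mx
    s = (y - mi) / (mx - mi)
    t = 0.5 - np.log(1.0 / s - 1.0) / np.log(200.0)
    return mi + (mx - mi) * t

so sigmoid_inv(sigmoid(x, mi, mx), mi, mx) recovers x, which Smoothclamp cannot do for x outside [mi, mx].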

For other purposes, you may need to be certain that f(x) = max for all x > max (and f(x) = min for all x < min) - in that case Smoothclamp is better. Also, as mentioned in another answer, there is a whole family of Smoothclamp functions, though the one given here is adequate for my purposes (no special properties other than a smooth derivative are needed).
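
A quick check of that exact agreement (a small sketch; the test values are arbitrary):

x = np.array([-3.0, 5.0, 10.0])     # all outside [mi, mx] = [-1, 4]
print(smoothclamp(x, -1, 4))        # [-1.  4.  4.]  -- exactly the clamp values
print(sigmoid(x, -1, 4))            # close to -1 and 4, but never exactly equal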

Plot them:

import numpy as np
import matplotlib.pyplot as plt
fig, ax = plt.subplots(1, 1)
x = np.linspace(-4,7,1000)
ax.plot(x, np.clip(x, -1, 4),'k-', lw=2, alpha=0.8, label='clamp')
ax.plot(x, smoothclamp(x, -1, 4),'g-', lw=3, alpha=0.5, label='smoothclamp')
ax.plot(x, sigmoid(x, -1, 4),'b-', lw=3, alpha=0.5, label='sigmoid')
plt.legend(loc='upper left')
plt.show()

[graph: clamp, smoothclamp and sigmoid plotted for x in [-4, 7] with bounds -1 and 4]

Also of potential use is the arithmetic mean of these two:

def clampoid(x, mi, mx): return mi + (mx-mi)*(lambda t: 0.5*(1+200**(-t+0.5))**(-1) + 0.5*np.where(t < 0 , 0, np.where( t <= 1 , 3*t**2-2*t**3, 1 ) ) )( (x-mi)/(mx-mi) )
Mannes answered 18/7, 2017 at 11:25 Comment(9)
Built-in functions min and max would not require numpy.Rsfsr
But is the area under the curve preserved?Mikkanen
@Rsfsr agreed, but then when you use them on an array/pandas series etc you will get the "truth value of a series is ambiguous" problem.Mannes
@guidot, another good reason to use numpy functions - they are vectorized. So you can apply such functions to an entire vector (or matrix) without looping, which is usually orders of magnitude faster.Grandnephew
FYI: For the normal clamp, you can use numpy.clip.Klaraklarika
@MaxU: The max function already accepts an iterable as first argument. What additional benefit does vectorization provide?Rsfsr
@guidot, consider the following example: import numpy as np; a = np.random.rand(10**6); %timeit max(a); %timeit np.max(a); - on my PC the vectorized Numpy function took 127 times less time ;-)Grandnephew
@MaxU: While the factor was even 137 on my machine, the factor shrank to 45 if a simple list was given to built-in max instead of a numpy array. Interesting to know the numpy way for hard number crunching but the standard solution is not that awful.Rsfsr
I'm accepting my own answer since it got more upvotesMannes

What you are looking for is something like the Smoothstep function, which has a free parameter N giving the "smoothness", i.e. how many derivatives should be continuous. It is defined as follows:

S_N(x) = 0                                                              for x <= 0
S_N(x) = x^(N+1) * sum_{n=0}^{N} C(N+n, n) * C(2N+1, N-n) * (-x)^n      for 0 <= x <= 1
S_N(x) = 1                                                              for x >= 1

where C(a, b) denotes the binomial coefficient.

This is used in several libraries and can be implemented in numpy as

import numpy as np
from scipy.special import comb

def smoothstep(x, x_min=0, x_max=1, N=1):
    # rescale to [0, 1] and clamp outside that range
    x = np.clip((x - x_min) / (x_max - x_min), 0, 1)

    # polynomial whose first N derivatives vanish at x = 0 and x = 1
    result = 0
    for n in range(0, N + 1):
        result += comb(N + n, n) * comb(2 * N + 1, N - n) * (-x) ** n

    result *= x ** (N + 1)

    return result

It reduces to the regular clamp function for N=0 (continuous, but not differentiable at the edges), and gives increasing smoothness as you increase N. You can visualize it like this:

import matplotlib.pyplot as plt

x = np.linspace(-0.5, 1.5, 1000)

for N in range(0, 5):
    y = smoothstep(x, N=N)
    plt.plot(x, y, label=str(N))

plt.legend()
plt.show()

which gives this result:

[plot of the smoothstep curves for N = 0 to 4]
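
A quick numerical check of the N=0 case mentioned above, using the default x_min=0 and x_max=1 (a small sketch; the test points are arbitrary):

x = np.linspace(-0.5, 1.5, 5)
print(smoothstep(x, N=0))                                  # [0.  0.  0.5 1.  1. ]
print(np.allclose(smoothstep(x, N=0), np.clip(x, 0, 1)))   # True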

Topping answered 18/7, 2017 at 11:57 Comment(5)
Great answer. For my purposes I prefer my implementation (obviously I am biased!) because it's a one-liner and the n=1 smoothstep does exactly what I want - I don't care about the second derivative being smooth. Thanks!Mannes
Sure, just chiming in with another option. Yours is obviously easier but this works very well for more general applications.Topping
@JonasAdler not sure this implementation really works with different min or max. The other one by RokoMijic scalesEmsmus
Are you sure? I tried it and it works well for me.Topping
because it's a one-liner. In this case, a very long one that is hard to parse for humans.Octave

As an option, if you want to make sure that there is a correspondence with the clamp function, you can convolve the normal clamp function with a smooth bell-like function such as a Lorentzian or a Gaussian.

This will guarantee the correspondence between the normal clamp function and its smoothed version. The smoothness itself will be defined by the underlying smooth function you choose to use in the convolution.
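
A minimal sketch of that idea, smoothing the sampled clamp with a Gaussian kernel (my own illustration, not the answer's code; the grid and sigma are arbitrary choices):

import numpy as np
from scipy.ndimage import gaussian_filter1d

x = np.linspace(-4, 7, 1000)
hard = np.clip(x, -1, 4)                       # the ordinary clamp, sampled on a grid
dx = x[1] - x[0]
sigma = 0.5                                    # kernel width in x units; larger means smoother
smooth = gaussian_filter1d(hard, sigma / dx, mode='nearest')

The choice of kernel controls how closely the smoothed curve hugs min and max near the corners, as the first comment on the question points out.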

Iterate answered 4/12, 2022 at 11:47 Comment(0)
