RMSE/ RMSLE loss function in Keras
I am trying to take part in my first Kaggle competition, where RMSLE is given as the required loss function. Since I have found nothing on how to implement this loss function, I tried to settle for RMSE. I know this was part of Keras in the past; is there any way to use it in the latest version, perhaps with a custom function via the backend?

This is the NN I designed:

from keras.models import Sequential
from keras.layers.core import Dense , Dropout
from keras import regularizers

model = Sequential()
model.add(Dense(units = 128, kernel_initializer = "uniform", activation = "relu", input_dim = 28,activity_regularizer = regularizers.l2(0.01)))
model.add(Dropout(rate = 0.2))
model.add(Dense(units = 128, kernel_initializer = "uniform", activation = "relu"))
model.add(Dropout(rate = 0.2))
model.add(Dense(units = 1, kernel_initializer = "uniform", activation = "relu"))
model.compile(optimizer = "rmsprop", loss = "root_mean_squared_error")#, metrics =["accuracy"])

model.fit(train_set, label_log, batch_size = 32, epochs = 50, validation_split = 0.15)

I tried a custom root_mean_squared_error function I found on GitHub, but as far as I can tell the syntax is not what is required. I think y_true and y_pred would have to be defined before being passed to the return statement, but I have no idea how exactly; I just started programming in Python and I am really not that good at math...

from keras import backend as K

def root_mean_squared_error(y_true, y_pred):
        return K.sqrt(K.mean(K.square(y_pred - y_true), axis=-1)) 

I receive the following error with this function:

ValueError: ('Unknown loss function', ':root_mean_squared_error')

Thanks for your ideas, I appreciate every help!

Tonietonight answered 8/5, 2017 at 18:49 Comment(1)
The root_mean_squared_error you defined seems equivalent to 'mse' (mean squared error) in Keras. Just FYI. – Kirimia
When you use a custom loss, you need to put it without quotes, as you pass the function object, not a string:

def root_mean_squared_error(y_true, y_pred):
        return K.sqrt(K.mean(K.square(y_pred - y_true))) 

model.compile(optimizer = "rmsprop", loss = root_mean_squared_error, 
              metrics =["accuracy"])
Jacquesjacquet answered 9/5, 2017 at 7:34 Comment(9)
Works perfectly fine, thank you very much for pointing out that mistake. I really did not think about it that way, as I am kind of new to programming. You would not happen to know how to edit this custom function so that it computes the root mean squared logarithmic error, would you? – Tonietonight
It gives me Unknown loss function: root_mean_squared_error – Annoyance
@Annoyance Please do not make such comments; ask your own question with source code. – Jacquesjacquet
@Annoyance You're probably putting quotes around the function's name. You need to pass the function object to the compile function, not its name. – Radicalism
You mean metrics=['mse']? – Mcabee
This code gives the same value as MAE, not RMSE (see the answer below). – Airglow
I just updated the answer: by setting axis=None (the default), it will take the mean over all dimensions. – Jacquesjacquet
@Mcabee mse stands for mean squared error: the difference is taken and then squared, followed by taking the mean. This is different from RMSE (root mean squared error), which takes the square root of the whole mean squared error. – Lelia
You should always add the import import tensorflow.keras.backend as K (I added it to the answer). – Morell
The accepted answer originally contained an error (axis=-1), which made the "RMSE" actually compute MAE, as per the following issue:

https://github.com/keras-team/keras/issues/10706

The correct definition should be:

from keras import backend as K

def root_mean_squared_error(y_true, y_pred):
    return K.sqrt(K.mean(K.square(y_pred - y_true)))
Verla answered 24/10, 2018 at 8:12 Comment(1)
Thank you very much for this comment! I spent so much time trying to figure out why my RMSE results (using the code above) were the same as MAE. – Airglow
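The axis pitfall above is easy to check numerically. A small NumPy sketch (outside Keras, but the same arithmetic) shows why the axis=-1 version degenerates to MAE for a single-output model:

```python
import numpy as np

# One regression target per sample, shape (batch, 1)
y_true = np.array([[1.0], [2.0], [3.0]])
y_pred = np.array([[1.5], [1.0], [5.0]])

# axis=-1 averages over the last axis, which has size 1, so
# sqrt(mean(square(diff), axis=-1)) is just |diff| per sample.
per_sample = np.sqrt(np.mean(np.square(y_pred - y_true), axis=-1))
mae = np.mean(np.abs(y_pred - y_true))
print(np.mean(per_sample), mae)   # identical: this is MAE

# Averaging over all elements before the square root gives true RMSE.
rmse = np.sqrt(np.mean(np.square(y_pred - y_true)))
print(rmse)                       # a different (larger) number
```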
If you are using the latest TensorFlow nightly, although there is no RMSE in the documentation, there is a tf.keras.metrics.RootMeanSquaredError() in the source code.

Sample usage:

model.compile(tf.compat.v1.train.GradientDescentOptimizer(learning_rate),
              loss=tf.keras.metrics.mean_squared_error,
              metrics=[tf.keras.metrics.RootMeanSquaredError(name='rmse')])
Schadenfreude answered 12/5, 2019 at 22:34 Comment(1)
I get an error when I try to use it as a loss function: AttributeError: 'RootMeanSquaredError' object has no attribute '__name__', even though I used the name parameter. – Whatnot
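As a rough illustration of what a stateful RMSE metric like this accumulates across batches (a NumPy sketch only, not the tf.keras implementation): it keeps a running sum of squared errors and an element count, and reports the square root of their ratio.

```python
import numpy as np

# Running totals, updated once per batch
total_squared_error = 0.0
count = 0

batches = [
    (np.array([1.0, 2.0]), np.array([1.5, 2.0])),
    (np.array([3.0]),      np.array([2.0])),
]
for y_true, y_pred in batches:
    total_squared_error += np.sum(np.square(y_pred - y_true))
    count += y_true.size

# RMSE over everything seen so far, not an average of per-batch RMSEs
rmse = np.sqrt(total_squared_error / count)
print(rmse)
```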
I prefer to reuse part of Keras's existing work:

from keras import backend as K
from keras.losses import mean_squared_error

def root_mean_squared_error(y_true, y_pred):
    return K.sqrt(mean_squared_error(y_true, y_pred))

model.compile(optimizer = "rmsprop", loss = root_mean_squared_error,
              metrics = ["accuracy"])
Bismuthinite answered 13/2, 2020 at 13:13 Comment(3)
One thing to note is that the gradient of this loss can blow up to infinity (because of the square root), and the training can fail. – Bismuthinite
I just tried this function and got this infinite loss ^_^ – Anniceannie
lol, yes, if at some point in the training the square root returns infinity, all your training fails – Bismuthinite
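The blow-up these comments describe comes from the square root: its derivative, 1/(2·sqrt(x)), is unbounded as the MSE approaches zero. A NumPy sketch of the effect, plus a common workaround (my assumption, not from this thread) of adding a small epsilon inside the root:

```python
import numpy as np

# d/dx sqrt(x) = 1 / (2 * sqrt(x)): unbounded as the MSE approaches 0
def sqrt_grad(x):
    return 1.0 / (2.0 * np.sqrt(x))

print(sqrt_grad(1e-2))   # 5.0
print(sqrt_grad(1e-8))   # 5000.0

# Hypothetical stabiliser (mirroring the idea behind K.epsilon()):
# adding eps inside the root bounds the gradient by 1 / (2 * sqrt(eps))
eps = 1e-7
def stable_rmse(mse):
    return np.sqrt(mse + eps)

print(stable_rmse(0.0))  # small but finite, with a finite gradient
```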
Just like before, but a simpler, more direct version for RMSLE using the Keras backend:

import tensorflow as tf
import tensorflow.keras.backend as K

def root_mean_squared_log_error(y_true, y_pred):
    msle = tf.keras.losses.MeanSquaredLogarithmicError()
    return K.sqrt(msle(y_true, y_pred)) 
Inessential answered 17/12, 2020 at 5:14 Comment(1)
You may want to add more explanation. – Xeric
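The equivalence this answer relies on can be checked with NumPy: Keras's MSLE averages squared differences of log(1 + x), so its square root matches the explicit RMSLE formula written out by hand.

```python
import numpy as np

y_true = np.array([1.0, 5.0, 10.0])
y_pred = np.array([1.2, 4.0, 12.0])

# MSLE-style quantity: mean of squared log(1 + x) differences
msle = np.mean(np.square(np.log1p(y_pred) - np.log1p(y_true)))
rmsle_via_msle = np.sqrt(msle)

# The same quantity written out explicitly
rmsle_explicit = np.sqrt(np.mean(np.square(np.log(1 + y_pred) - np.log(1 + y_true))))

assert np.isclose(rmsle_via_msle, rmsle_explicit)
print(rmsle_via_msle)
```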
You can do RMSLE the same way RMSE is shown in the other answers; you just also need to incorporate the log function:

from tensorflow.keras import backend as K

def root_mean_squared_log_error(y_true, y_pred):
    return K.sqrt(K.mean(K.square(K.log(1+y_pred) - K.log(1+y_true))))
Environment answered 30/7, 2020 at 22:1 Comment(1)
Note that y_pred and y_true need to be float values: K.sqrt(K.mean(K.square(K.log(float(y_pred + 1)) - K.log(float(y_true + 1))))) – Athos
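A small numerical aside (my addition, not from the thread): writing log(1 + x) directly can lose precision when x is tiny, because 1 + x rounds to 1.0 in floating point. A dedicated log1p (np.log1p in NumPy, tf.math.log1p in TensorFlow) avoids this:

```python
import numpy as np

x = 1e-17
print(np.log(1 + x))  # 0.0 -- 1 + 1e-17 rounds to exactly 1.0 in float64
print(np.log1p(x))    # ~1e-17 -- log1p avoids the rounding
```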

© 2022 - 2024 — McMap. All rights reserved.