What does 'INFO:tensorflow:Oracle triggered exit' mean with keras tuner?

When I run Keras Tuner search, the code runs for some epochs and then says: 'INFO:tensorflow:Oracle triggered exit'.

What does this mean? I am still able to extract the best hyperparameters. Is it due to early stopping? I have tried both RandomSearch and Hyperband.

Abound answered 8/6, 2020 at 9:13 Comment(0)

You can solve this with:

from keras_tuner import RandomSearch

tuner = RandomSearch(
    tune_rnn_model,
    objective='val_accuracy',
    seed=SEED,
    overwrite=True,
    max_trials=MAX_TRIALS,
    directory='project')

To begin a new search and ignore any prior results, set overwrite=True. Alternatively, you can delete the tuner's directory folder with this command:

!rm -r <directory folder>
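Either way, the next search starts from scratch. A minimal usage sketch, assuming x_train, y_train, x_val, and y_val are defined elsewhere:

# Run the search, then retrieve the best hyperparameters found.
tuner.search(x_train, y_train,
             epochs=10,
             validation_data=(x_val, y_val))
best_hps = tuner.get_best_hyperparameters(num_trials=1)[0]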
Higgler answered 10/7, 2022 at 6:34 Comment(0)

The reason is probably that the directory has already been created.

Try the following steps:

  1. Change the directory name.
  2. Restart the kernel.
  3. Re-run all the code.
Joelynn answered 14/4, 2021 at 14:11 Comment(1)
Welcome to SO! Unfortunately your answer doesn't add anything to the most voted one. Please edit it, providing additional info and/or a code sample. – Simeon

Try adding the directory argument where you define your tuner, or, if you have already added the directory argument, try changing its value. Note the last line in the RandomSearch example below:

from kerastuner.tuners import RandomSearch  # the package is now named keras_tuner

tuner = RandomSearch(
    tune_rnn_model,
    objective='val_accuracy',
    seed=SEED,
    max_trials=MAX_TRIALS,
    directory='change-this-value',  # pick a directory name not used by a previous run
)
Pavlov answered 14/6, 2020 at 10:19 Comment(2)
Tried changing it and still the same message, unfortunately. – Abound
This did solve it for me. For some reason I need to delete the directory folder it creates every time I want to run RandomSearch. – Cilka

I solved this issue by setting these two conditions in my tuner (see the sketch after this list):

  • overwrite = False
  • a max_trials value in the Oracle greater than the number of trials already completed when the "Oracle triggered exit" message appeared (I'm using the kerastuner.oracles.BayesianOptimization Oracle)
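A minimal sketch of that setup using the high-level BayesianOptimization tuner; build_model, MAX_TRIALS, and the directory name are assumptions:

import keras_tuner as kt

tuner = kt.BayesianOptimization(
    build_model,
    objective='val_accuracy',
    max_trials=MAX_TRIALS,  # set higher than the number of trials already completed
    overwrite=False,        # resume from the saved state instead of starting over
    directory='project',
)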
Blindly answered 14/6, 2021 at 13:26 Comment(0)

I found the same issue, and there is a very easy solution: remove oracle.json (and the other .json files) from the directory generated by Keras Tuner, then run the search again.
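A minimal sketch of that cleanup, assuming the tuner wrote its state to project/untitled_project (adjust the path to your own directory and project name):

import glob
import os

# Delete oracle.json and any other saved tuner state files.
for path in glob.glob('project/untitled_project/*.json'):
    os.remove(path)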

Descendant answered 1/12, 2021 at 8:28 Comment(0)

I believe this is occurring because you are working on a small dataset, which results in a large number of collisions while performing the random search.

Try reducing max_trials in your RandomSearch; that may fix the issue.

Kirin answered 8/6, 2020 at 9:22 Comment(1)
max_trials is set to 1, and the dataset has 28 features, with 20,000 training instances and 6,000 validation instances. When I run the Hyperband search it runs for some time before getting the message, but when I run RandomSearch I get the message instantly. – Abound

I had the same issue with the Hyperband search.

For me the issue was solved by removing the EarlyStopping callback from the tuner search, as in the sketch below.
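A minimal sketch of the search call with the callback removed; tuner, the data arrays, and the epoch count are assumptions:

# The same search call as before, minus the EarlyStopping callback.
tuner.search(
    x_train, y_train,
    epochs=10,
    validation_data=(x_val, y_val),
    # callbacks=[tf.keras.callbacks.EarlyStopping(patience=3)],  # removed
)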

Considerable answered 1/8, 2020 at 11:18 Comment(0)

I resolved this issue by moving hp = HyperParameters() out of the build_model function, i.e. initializing the hp variable outside of build_model, as in the sketch below.
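A minimal sketch of that arrangement; the model body and tuner settings are assumptions:

import keras_tuner as kt
from tensorflow import keras

hp = kt.HyperParameters()  # created once, outside build_model

def build_model(hp):
    model = keras.Sequential([
        keras.layers.Dense(hp.Int('units', 32, 128, step=32), activation='relu'),
        keras.layers.Dense(1),
    ])
    model.compile(optimizer='adam', loss='mse')
    return model

tuner = kt.RandomSearch(
    build_model,
    objective='val_loss',
    max_trials=4,
    hyperparameters=hp,  # pass the externally created HyperParameters
    overwrite=True,
)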

Petulancy answered 11/6, 2021 at 19:24 Comment(0)

I had this issue because I gave two hyperparameters the same name.

E.g., in the build_model(hp) function I had:

def build_model(hp):
    ...
    a = hp.Choice('embedding_dim', [32, 64])    # both parameters use the
    b = hp.Choice('embedding_dim', [128, 256])  # same name: this is the bug
    ...
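A sketch of the corrected version, with each hyperparameter given a unique name (the names here are hypothetical):

def build_model(hp):
    ...
    a = hp.Choice('embedding_dim_a', [32, 64])
    b = hp.Choice('embedding_dim_b', [128, 256])
    ...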

A final note: be careful to have at least as many hyperparameter combinations as trials. My build_model example has 4 possible combinations of hyperparameters (2 * 2), so max_trials should be <= 4.

I hope this helps someone.

Randa answered 15/2, 2022 at 13:33 Comment(0)

I had the same question and didn't find what I was looking for.

If the tuner finishes at a trial number lower than your max_trials parameter, the most likely reason is that it has already tried every possible combination in the hyperparameter space you defined.

Example: I had 2 hyperparameters for the tuner to try; the first could take 8 values, the second 18. Multiplying the two gives 144 combinations, and that is exactly the trial number at which the tuner stopped.
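A quick arithmetic check of that example (the two value counts are taken from the example above):

# 8 values for the first hyperparameter, 18 for the second.
first, second = 8, 18
print(first * second)  # 144 distinct combinations, so the search ends at trial 144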

Magnetoelectricity answered 14/4, 2022 at 16:16 Comment(0)
