Julia Flux error: SGD optimiser is undefined
I want to use the SGD optimiser in Flux, as shown in the Julia Academy tutorial for Deep Learning with Flux.jl. This is the notebook they provided, in which they use the SGD optimizer as:

opt = SGD(params(model))

However, when I run SGD I get:

ERROR: UndefVarError: SGD not defined

This is my output when I run ?SGD:

search: SGD AMSGrad Signed signed Unsigned unsigned sigmoid issetgid logsigmoid StringIndexError isassigned significand

Couldn't find SGD
Perhaps you meant SGD, Set, Sys, GC, Some, sec, sin, sum, LSTM, csc, esc, isa, ans, abs, cis, cos, eps, ARGS, Pkg, GRU, RNN, cpu, elu, f32, f64, gpu, σ, !, !=, !== or %
  No documentation found.

  Binding SGD does not exist.

As you can see, SGD still appears in the "Perhaps you meant" line.

I do not get an error when I run other optimizers shown in the tutorial, such as ADAM. I am using Flux v0.10.0.

Erotogenic answered 20/1, 2020 at 5:27
The tutorial uses an outdated version of Flux.

In Flux v0.10.0, SGD has been deprecated in favor of Descent, which implements classic gradient descent with a configurable learning rate.

More information on the Descent optimizer can be found in the documentation.
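Concretely, the old SGD construction maps onto Descent like this (a minimal sketch for Flux v0.10.x; the learning rate 0.1 is just an illustrative value):

```julia
using Flux

# Old (Flux <= 0.9): opt = SGD(params(model), 0.1)
# New (Flux 0.10):   the optimizer holds only the learning rate
opt = Descent(0.1)
```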

Also, as a side note, you no longer pass params(model) into the optimizer; instead, the parameters are passed as a separate argument to Flux.train!.

# New way: parameters go to train!, not the optimizer
Flux.train!(loss, params(model), data, opt)
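Putting the two changes together, here is a minimal end-to-end sketch under Flux v0.10.x; the model, loss, and data below are made up purely for illustration:

```julia
using Flux

# Toy model and data, purely illustrative
model = Dense(2, 1)
loss(x, y) = Flux.mse(model(x), y)

x = rand(Float32, 2, 10)
y = rand(Float32, 1, 10)
data = [(x, y)]

opt = Descent(0.1)                           # replaces the old SGD(params(model))
Flux.train!(loss, params(model), data, opt)  # params passed to train!, not the optimizer
```

Note that the training API changed again in later Flux releases, so this sketch matches v0.10.x specifically.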
Erotogenic answered 20/1, 2020 at 5:32