I am writing a machine learning program for time series analysis, and NEAT (NeuroEvolution of Augmenting Topologies) could help with the work. I started learning TensorFlow not long ago, but it seems that computational graphs in TensorFlow are usually fixed. Are there tools in TensorFlow that help build a dynamically evolving neural network? Or would something like PyTorch be a better alternative? Thanks.
One way to make an evolving TensorFlow network is to use the HyperNEAT or ES-HyperNEAT algorithms. Instead of running evolution directly on the individual networks in the species, these algorithms evolve a "genome" that is actually a CPPN (compositional pattern-producing network) encoding the phenotype neural nets. For the CPPN you can use a feed-forward TensorFlow network, with the caveat that each node may use a different activation function. The evolved CPPN can then be queried for the structure and weights of the "phenotype" neural network, for which you can use a generic TensorFlow net (or whatever net you choose).
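To make the querying step concrete, here is a minimal, library-free sketch of a HyperNEAT-style substrate query. The CPPN below is hand-wired with made-up weights and activations purely for illustration (in practice the CPPN's topology and weights are what NEAT evolves); the function and variable names are my own, not from any library:

```python
import math

# Activation functions a CPPN node may use; mixing these is what lets
# the CPPN express repeating and symmetric weight patterns.
ACTIVATIONS = {
    "sin": math.sin,
    "tanh": math.tanh,
    "gauss": lambda x: math.exp(-x * x),
}

def cppn(x1, y1, x2, y2):
    # Hypothetical hand-wired CPPN with two hidden nodes using
    # different activations. An evolved CPPN would have an arbitrary
    # topology discovered by NEAT; this one is fixed for the example.
    h1 = ACTIVATIONS["sin"](0.5 * x1 - 0.3 * y2)
    h2 = ACTIVATIONS["gauss"](0.8 * x2 + 0.2 * y1)
    return ACTIVATIONS["tanh"](1.5 * h1 - 0.7 * h2)

def query_substrate(inputs, outputs, threshold=0.2):
    """Ask the CPPN for a weight for every input->output node pair in
    the phenotype substrate; connections whose magnitude falls below
    the threshold are pruned (a common HyperNEAT convention)."""
    weights = {}
    for (x1, y1) in inputs:
        for (x2, y2) in outputs:
            w = cppn(x1, y1, x2, y2)
            if abs(w) > threshold:
                weights[(x1, y1, x2, y2)] = w
    return weights

# Two input nodes and one output node, placed at 2D coordinates:
substrate_weights = query_substrate(
    [(-1.0, -1.0), (1.0, -1.0)], [(0.0, 1.0)]
)
print(substrate_weights)
```

The resulting weight dictionary is exactly what you would load into the phenotype network (TensorFlow or otherwise); only the CPPN needs to change shape between generations.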
I would look into the neat-python and peas libraries, study the network classes they use, and replicate those classes with TensorFlow nets.
This can't be implemented in TensorFlow's static graph mode without significant tradeoffs, because the topology of the neural networks in the population changes during evolution. Static graphs suit models whose architecture doesn't change during training. It can, however, be done in TensorFlow Eager or PyTorch, since both support dynamic computation graphs.
Check this implementation in TensorFlow Eager: https://github.com/crisbodnar/TensorFlow-NEAT
TensorFlow supports eager execution, which can handle arbitrarily dynamic network topologies.
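To illustrate why eager/dynamic execution fits NEAT where a static graph doesn't, here is a minimal, library-free sketch of evaluating a feed-forward NEAT-style genome. The genome encoding and helper names are my own invention for the example, not the format used by TensorFlow-NEAT or neat-python:

```python
import math

def forward(genome, inputs):
    """Evaluate a feed-forward NEAT-style genome.

    genome: {"nodes": {node_id: activation_name},
             "connections": [(src, dst, weight), ...]}
    Node ids 0..len(inputs)-1 are the input nodes. Evaluation order is
    determined at run time from each individual's own connection list,
    which is exactly what a fixed static graph cannot express.
    """
    values = {i: v for i, v in enumerate(inputs)}
    acts = {"identity": lambda x: x, "tanh": math.tanh}
    # Simple topological pass: evaluate any node whose inputs are all
    # ready, until none remain (assumes no recurrent connections).
    pending = set(genome["nodes"]) - set(values)
    while pending:
        for node in sorted(pending):
            incoming = [(s, w) for s, d, w in genome["connections"] if d == node]
            if all(s in values for s, _ in incoming):
                total = sum(values[s] * w for s, w in incoming)
                values[node] = acts[genome["nodes"][node]](total)
                pending.remove(node)
                break
        else:
            raise ValueError("cycle or disconnected node in genome")
    return values

# Two individuals from the same population with different topologies;
# the second has grown an extra hidden node (a NEAT "add node" mutation):
g1 = {"nodes": {2: "tanh"},
      "connections": [(0, 2, 0.5), (1, 2, -0.5)]}
g2 = {"nodes": {2: "tanh", 3: "tanh"},
      "connections": [(0, 2, 0.5), (2, 3, 1.0), (1, 3, 0.8)]}
out1 = forward(g1, [1.0, 1.0])[2]
out2 = forward(g2, [1.0, 1.0])[3]
print(out1, out2)
```

In eager mode (or PyTorch) the same idea applies with tensors and trainable variables in place of plain floats: the forward pass is ordinary Python control flow, so each genome in the population can have a different shape without rebuilding a graph.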