Matrix Factorization in TensorFlow 2.0 using the WALS Method

I am using the WALS method to perform matrix factorization. In TensorFlow 1.13 I could import factorization_ops using

from tensorflow.contrib.factorization.python.ops import factorization_ops 

As described in the documentation

The WALS model can then be instantiated from factorization_ops with

factorization_ops.WALSModel
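
For reference, a minimal sketch of the TF 1.x training pattern (following the tf.contrib documentation and the Google Cloud WALS sample; the dimensions, hyperparameters and the toy sparse matrix below are placeholders):

import tensorflow as tf  # TF 1.x
from tensorflow.contrib.factorization.python.ops import factorization_ops

num_rows, num_cols, latent_factors = 1000, 500, 10  # placeholder dimensions

# Toy sparse ratings matrix (rows = users, cols = items)
sp_ratings = tf.SparseTensor(indices=[[0, 0], [1, 2]], values=[5.0, 3.0],
                             dense_shape=[num_rows, num_cols])

model = factorization_ops.WALSModel(
    num_rows, num_cols, latent_factors,
    regularization=0.01, unobserved_weight=0.1)

# Each update call returns a tuple; element [1] is the factor-update op
row_update_op = model.update_row_factors(sp_input=sp_ratings)[1]
col_update_op = model.update_col_factors(sp_input=sp_ratings)[1]

with tf.Session() as sess:
    sess.run(model.initialize_op)
    sess.run(model.worker_init)
    for _ in range(10):  # alternate row and column sweeps
        sess.run(model.row_update_prep_gramian_op)
        sess.run(model.initialize_row_update_op)
        sess.run(row_update_op)
        sess.run(model.col_update_prep_gramian_op)
        sess.run(model.initialize_col_update_op)
        sess.run(col_update_op)
    row_factors = sess.run(model.row_factors)  # list of factor shards
    col_factors = sess.run(model.col_factors)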

Using the same import in TensorFlow 2.0 gives me the following error:

ModuleNotFoundError: No module named 'tensorflow.contrib.factorization'

Going through the issue, there appears to be no way to use WALSModel in TensorFlow 2.0+.

It has also been mentioned here in the TensorFlow release notes that tf.contrib has been deprecated, and its functionality has either been migrated to the core TensorFlow API, to an ecosystem project such as tensorflow/addons or tensorflow/io, or removed entirely.

How can I use the WALS model in TensorFlow 2.0 (currently I am using 2.0.0-rc0 on a Windows machine)? Has WALSModel been removed, or am I missing some information?

Sukiyaki answered 12/9, 2019 at 8:9 Comment(0)

I believe WALS is not supported in TF 2.0. The officially recommended model is Neural Collaborative Filtering (NCF).

I hope this helps.

Poltergeist answered 24/9, 2019 at 13:21 Comment(1)
Thanks, this will be helpful in another way :) – Sukiyaki

I have the same issue, but unfortunately I don't really have time to write a library myself. There are several options I am considering:

  1. Stick with TF1.X until someone creates a library

  2. Switch to lightfm to keep a comparable implicit-feedback matrix-factorization workflow (a minimal sketch follows after the code below)

  3. Switch to neural collaborative filtering using embedding layers in Keras and a dot-product layer. See the paper https://arxiv.org/abs/1708.05031 and this implementation:

from tensorflow.keras.layers import Input, Embedding, Flatten, Dot
from tensorflow.keras.models import Model
# import tensorflow as tf  # only needed if you enable distributed training below

def get_compiled_model(n_users, n_items, embedding_dims=20):
    # Product embedding
    prod_input = Input(shape=[1], name="Item-Input")
    prod_embedding = Embedding(n_items+1, embedding_dims, name="Item-Embedding")(prod_input)
    prod_vec = Flatten(name="Flatten-Product")(prod_embedding)

    # User embedding
    user_input = Input(shape=[1], name="User-Input")
    user_embedding = Embedding(n_users+1, embedding_dims, name="User-Embedding")(user_input)
    user_vec = Flatten(name="Flatten-Users")(user_embedding)

    # The output is the dot product of the two embeddings,
    # i.e. a single predicted score per (user, item) pair
    dot_product = Dot(name="Dot-Product", axes=1)([prod_vec, user_vec])

    # compile - to make training distributed, uncomment the two lines below
    # and indent the model construction and compilation under the scope
    # dist_strat = tf.distribute.MirroredStrategy()
    # with dist_strat.scope():
    model = Model(inputs=[user_input, prod_input], outputs=dot_product)
    model.compile(
        optimizer='adam',
        loss='mean_squared_error'
    )
    return model
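
A minimal usage sketch for the model above (the interaction arrays are hypothetical placeholders):

import numpy as np

n_users, n_items = 100, 50
user_ids = np.random.randint(0, n_users, size=1000)
item_ids = np.random.randint(0, n_items, size=1000)
ratings = np.random.uniform(1.0, 5.0, size=1000)

model = get_compiled_model(n_users, n_items, embedding_dims=20)
model.fit([user_ids, item_ids], ratings, epochs=5, batch_size=64)
preds = model.predict([user_ids[:10], item_ids[:10]])  # predicted scores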


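Regarding option 2 above: a minimal lightfm sketch (note that lightfm fits implicit-feedback factorization models with SGD-based losses such as WARP or BPR rather than WALS itself; the toy interaction matrix is a placeholder):

import numpy as np
from scipy.sparse import coo_matrix
from lightfm import LightFM

# Toy implicit-feedback interactions as a sparse user x item matrix
rows = np.array([0, 1, 2])
cols = np.array([1, 2, 0])
interactions = coo_matrix((np.ones(3), (rows, cols)), shape=(3, 3))

model = LightFM(no_components=20, loss='warp')
model.fit(interactions, epochs=10, num_threads=2)
scores = model.predict(np.array([0, 0, 0]), np.array([0, 1, 2]))  # user 0 vs items 0-2
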
Myeshamyhre answered 24/9, 2019 at 11:13 Comment(0)

I have compared the TensorFlow implementation of WALS to other implementations with respect to compute resources and accuracy (https://github.com/gtsoukas/cfzoo). The comparison suggests that the implicit Python package (https://github.com/benfred/implicit) is a good replacement that delivers superior performance.
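
For reference, a minimal sketch with the implicit package (the toy matrix is a placeholder; note that the matrix orientation expected by fit changed between implicit versions, so check the version you install):

import numpy as np
from scipy.sparse import csr_matrix
import implicit

# Toy implicit-feedback counts
data = csr_matrix(np.array([[1, 0, 3],
                            [0, 2, 0],
                            [4, 0, 1]], dtype=np.float32))

model = implicit.als.AlternatingLeastSquares(factors=20, regularization=0.01, iterations=15)
model.fit(data)  # implicit >= 0.5 expects user x item; older versions expected item x user

user_factors = model.user_factors  # learned latent factors
item_factors = model.item_factors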

Ullage answered 6/12, 2019 at 15:24 Comment(0)

I think that WALS has been removed. As part of tf.contrib it is not supported in TF2, and I do not think it fits into the core or any of the sub-projects. Your best bet is probably to make it available as a third-party library.

I expect to use it for my project, but the need to re-write it (mainly copying what was in TF1 and making it work as a separate library compatible with TF2) reduces the priority of this task...

Let us know if you start to code something. Thanks.

Alexis.

Greylag answered 23/9, 2019 at 7:7 Comment(1)
Yes, I am thinking of writing this in case they discontinue it, as I have deployed one in TF 1.13. There is a bug open for this; I am waiting for their final decision. Seems like we are on the same page. Do you want to collaborate? – Sukiyaki
