How to use a CRF layer in Tensorflow 2 (using tfa.text)?
# vocab_size, input_len, embedding_matrix, vocab_size_label and optim are defined earlier.
from tensorflow import keras
from tensorflow.keras.models import Sequential

model = Sequential()
model.add(keras.layers.Embedding(vocab_size, output_dim=100, input_length=input_len,
                                 weights=[embedding_matrix], trainable=False))
model.add(keras.layers.Bidirectional(keras.layers.LSTM(512, return_sequences=True,
                                                       recurrent_dropout=0.2, dropout=0.2)))
model.add(keras.layers.Bidirectional(keras.layers.LSTM(512, return_sequences=True,
                                                       recurrent_dropout=0.2, dropout=0.2)))
model.add(keras.layers.Dense(128, activation="relu"))
model.add(keras.layers.TimeDistributed(keras.layers.Dense(vocab_size_label, activation="softmax")))
model.compile(optimizer=optim, loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()

I have built a Bi-LSTM model for NER tagging and now want to introduce a CRF layer into it. I am confused about how to do this with TensorFlow Addons, which provides:

tfa.text.crf_log_likelihood(
    inputs,
    tag_indices,
    sequence_lengths,
    transition_params=None
)

I found this in tfa.text and have a couple of questions about this function: 1. How do I pass these arguments? 2. Do I have to use its output (the negative of the log-likelihood) as the loss when compiling the model? Can someone please help me with this?

Ashely answered 9/12, 2019 at 21:17

I am looking for a solution to this too, and I guess you should create a custom class that wraps the tfa.text.crf_log_likelihood method and then integrate it into your Keras model.

Maybe something like https://github.com/tensorflow/addons/issues/723#issuecomment-559636561

Or, for something more PyTorch-style, like this: https://github.com/saiwaiyanyu/bi-lstm-crf-ner-tf2.0/blob/master/model.py
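
To make that concrete, here is a minimal sketch of the idea (my own rough adaptation, not the exact code from either link): a subclassed Keras model that owns a trainable transition matrix, uses the negative mean CRF log-likelihood as its loss inside train_step, and decodes with tfa.text.crf_decode at prediction time. The class name BiLstmCrf, the predict_tags method and the assumption that index 0 is the padding token are mine; vocab_size, embedding_matrix, input_len and num_tags are assumed to be defined on your side.

import tensorflow as tf
import tensorflow_addons as tfa
from tensorflow import keras

class BiLstmCrf(keras.Model):
    def __init__(self, vocab_size, embedding_matrix, input_len, num_tags, **kwargs):
        super().__init__(**kwargs)
        self.embedding = keras.layers.Embedding(
            vocab_size, 100, input_length=input_len,
            weights=[embedding_matrix], trainable=False)
        self.bilstm = keras.layers.Bidirectional(
            keras.layers.LSTM(512, return_sequences=True))
        # Per-token unary scores for each tag; no softmax, the CRF works on raw logits.
        self.unary = keras.layers.TimeDistributed(keras.layers.Dense(num_tags))
        # Tag-to-tag transition scores, learned jointly with the rest of the network.
        self.transition_params = tf.Variable(
            tf.random.uniform((num_tags, num_tags)), name="transition_params")

    def call(self, inputs, training=False):
        x = self.embedding(inputs)
        x = self.bilstm(x, training=training)
        return self.unary(x)

    def train_step(self, data):
        x, y = data                       # y: integer tag indices, shape (batch, seq_len)
        y = tf.cast(y, tf.int32)
        # Assumes index 0 is the padding token in x.
        seq_lens = tf.math.count_nonzero(x, axis=-1, dtype=tf.int32)
        with tf.GradientTape() as tape:
            logits = self(x, training=True)
            log_likelihood, _ = tfa.text.crf_log_likelihood(
                logits, y, seq_lens, transition_params=self.transition_params)
            loss = -tf.reduce_mean(log_likelihood)   # query 2: yes, negate the log-likelihood
        grads = tape.gradient(loss, self.trainable_variables)
        self.optimizer.apply_gradients(zip(grads, self.trainable_variables))
        return {"loss": loss}

    def predict_tags(self, x):
        logits = self(x, training=False)
        seq_lens = tf.math.count_nonzero(x, axis=-1, dtype=tf.int32)
        # Viterbi decoding with the learned transition matrix.
        tags, _ = tfa.text.crf_decode(logits, self.transition_params, seq_lens)
        return tags

You would then compile it with only an optimizer (the loss is computed inside train_step), e.g. model.compile(optimizer="adam"), train with model.fit(X_train, y_train), and call predict_tags to get the Viterbi-decoded tag sequences.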

Romanism answered 6/1, 2021 at 10:13
