Online Learning in Keras? #1868
Comments
When you say online learning, do you mean this?
Say I train a model on a training set. Example:
Now, when I do
does the above statement retrain the model from scratch, or do the model parameters just get updated?
The parameters just get updated by `.fit()`, so it should be suitable for you.
Okay, thank you!
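The behaviour described above, where each successive `fit()` call continues from the current weights rather than reinitialising them, can be sketched as follows (the model, layer sizes, and data here are illustrative, not from this thread):

```python
import numpy as np
from tensorflow import keras

# Toy model; architecture and data are purely illustrative.
model = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Initial offline training.
x_old, y_old = np.random.rand(32, 4), np.random.rand(32, 1)
model.fit(x_old, y_old, epochs=2, verbose=0)
w_before = model.layers[0].get_weights()[0].copy()

# A later fit() on new data continues from the current weights;
# nothing is reinitialised, only updated.
x_new, y_new = np.random.rand(8, 4), np.random.rand(8, 1)
model.fit(x_new, y_new, epochs=1, verbose=0)
w_after = model.layers[0].get_weights()[0]
```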
The parameters are not just updated; what was learned from the initial training also gets overwritten. The more new_data_input you feed in, the more the old training is overwritten, and accuracy on the original training data will probably go down. I guess by online learning you mean that each new input is added to the total set of inputs the network has ever seen, rather than overwriting them, but I don't think this is the case.
I'm not sure I follow exactly what Marc is trying to say. But it does bear noting that when you restart fit(), any kind of iteration-based learning-rate decay in the optimizer restarts from iteration 0. Fixing this (to set the initial #iterations > 0) will require some changes.
@ashwin123: I am also looking for online learning in Keras. What I understand is that: ... Also, from your experience, how did your model perform?
Yes, I was updating the model with a new batch of data points. The model seemed to perform well.
@ashwin123 Looking for some good practices for building models using online deep learning.
I have put up basic code for online deep learning in Keras. There is a difference in the outcomes of offline and online learning: on test data, offline training gave 97.98% accuracy while online learning gave 93.96%. Is this the right way to implement online learning in Keras?
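For reference, one common low-level way to do the online half of such an experiment in Keras is `train_on_batch`, which performs exactly one gradient update per incoming batch. This is only a minimal sketch on synthetic data, not the notebook linked above:

```python
import numpy as np
from tensorflow import keras

model = keras.Sequential([
    keras.Input(shape=(10,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# Simulate a data stream: each step delivers a small batch of new
# examples and the model takes a single gradient step on it.
for step in range(100):
    x_batch = np.random.rand(4, 10)
    y_batch = (x_batch.mean(axis=1) > 0.5).astype("float32")
    loss = model.train_on_batch(x_batch, y_batch)
```

Because every step only sees the newest batch, older examples are gradually forgotten, which is consistent with the offline/online accuracy gap reported above.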
@anujgupta82 the link you gave is broken. Can you please give another link that works? It would be highly appreciated. Thanks.
@ashwin123 would you be willing to share your code for this issue?
Thank you for the example code @anujgupta82. As one would assume, the training time for an online LSTM can be prohibitively slow. I would like to train my network on mini-batches and test (run prediction) online. If anyone can help, please take a look at my question on SO.
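For the train-on-mini-batches / predict-online question, one pattern that may help (a hedged sketch, not an answer from this thread; shapes and sizes are illustrative) is to train a regular LSTM on fixed-length windows and then copy its weights into a `stateful` clone with batch size 1, which can consume one timestep per call at prediction time:

```python
import numpy as np
from tensorflow import keras

timesteps, features = 5, 3

# Train offline on mini-batches of fixed-length windows.
train_model = keras.Sequential([
    keras.Input(shape=(timesteps, features)),
    keras.layers.LSTM(8),
    keras.layers.Dense(1),
])
train_model.compile(optimizer="adam", loss="mse")
x = np.random.rand(64, timesteps, features)
y = np.random.rand(64, 1)
train_model.fit(x, y, batch_size=16, epochs=1, verbose=0)

# For online prediction, copy the weights into a stateful clone that
# consumes one timestep per call and carries its state between calls.
infer_model = keras.Sequential([
    keras.Input(batch_shape=(1, 1, features)),
    keras.layers.LSTM(8, stateful=True),
    keras.layers.Dense(1),
])
infer_model.set_weights(train_model.get_weights())

for t in range(timesteps):
    step = np.random.rand(1, 1, features)
    pred = infer_model.predict(step, verbose=0)
# (reset the recurrent state before starting a new sequence)
```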
@fchollet |
```python
# create model
...

# train model on available data
model.fit(.....)

# save model
model.save('yourfileName.h5')
```

Some time passes...

```python
from keras.models import load_model

model = load_model('yourfileName.h5')

# continue training
model.fit(...)
```
The example flow from @patyork is only good for transfer learning or weight initialization, not for online learning, since the optimizer's parameters (the large learning rate, decay, ...) are restarted by model.fit(...). What I did was set a small learning rate and a reasonable number of epochs, and combine the online training data with part of the previous training data when fitting after loading the model. The experimental results are not too bad, but I don't think this is a good way to go. Any other approaches to online deep learning?
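A minimal sketch of the approach described above: reload the model, recompile with a small learning rate, and mix the new data with a slice of the old data (simple rehearsal). The file name, learning rates, architecture, and data below are illustrative assumptions:

```python
import numpy as np
from tensorflow import keras

# Hypothetical setup: train and save an initial model so the snippet
# is self-contained; architecture and data are illustrative.
model = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-3), loss="mse")
x_old, y_old = np.random.rand(64, 4), np.random.rand(64, 1)
model.fit(x_old, y_old, epochs=2, verbose=0)
model.save("model.h5")

# Later: reload, then recompile with a *small* learning rate, since
# compiling rebuilds the optimizer from scratch.
model = keras.models.load_model("model.h5")
model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-4), loss="mse")

# Mix the new data with a random slice of the old data (simple
# rehearsal) to limit forgetting of the original training set.
x_new, y_new = np.random.rand(16, 4), np.random.rand(16, 1)
idx = np.random.choice(len(x_old), size=16, replace=False)
x_mix = np.concatenate([x_new, x_old[idx]])
y_mix = np.concatenate([y_new, y_old[idx]])
model.fit(x_mix, y_mix, epochs=1, verbose=0)
```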
@smhoang,
@snowde aren't you just loading the model and compiling it? Where are you adding new data?
@snowde @MLDSBigGuy How does this:
differ from:
@toluwajosh here is the working link: https://github.com/anujgupta82/DeepNets/blob/master/Online_Learning/Online_Learning_DeepNets.ipynb Maybe somebody hasn't found the repository yet.
Hi. Regarding online learning, I believe the issue is something like this: online learning keeps the weights of the hidden layers and "dumps" the input and output layers (since they will be different), so the code I developed should help with online learning :) (PS: X is a sparse matrix.)

```python
# to update the model with new data you need:
model2 = Sequential()
...  # new input layer for the new data
for layer in model.layers[1:-1]:  # keep the old model's hidden layers
    model2.add(layer)
model2.add(Dense(1, activation='sigmoid'))  # new output layer
model2.compile(loss='binary_crossentropy', ...)
history = model2.fit(X, train_Y.target, ...)
```

Do you agree with this idea as a way to update the model with volatile data (as is the case with Big Data)? PS2: I saw two posts of this, sorry for the repost. EDIT: This usually happens when you one-hot encode data.
Guys, the link is working fine for me.
I suggest looking at this paper when building reusable neural networks as well.
So what’s the solution?
Basically, according to the paper at https://ieeexplore.ieee.org/abstract/document/8851888 (published at the IJCNN conference), you can memorize the structure at a certain point in time (in this case, at each U). If an input differs from the previously learnt structure, you can basically assign a random weight to that connection (between the input and the first layer). If the weight is known, just assign the same learnt weight.
I have a similar problem. What is the final solution for it?
@dileepkumarg-sa were you able to get a solution?
According to this SO thread, as of March 1st, 2021 the following is the case:
So online learning can be accomplished in Keras: if you save the model and then load it, you can just continue training it with the `fit()` method.
Maybe this tutorial is good: https://www.pyimagesearch.com/2019/06/17/online-incremental-learning-with-keras-and-creme/
I wanted to implement online learning for an LSTM RNN. Does Keras support online learning as of now?
If not, can someone direct us to any source on how online learning can be implemented for an RNN (at least on a conceptual level)?