I’m running a classifier that draws data from two large datasets using two generators. I build one model and then train it in a loop that looks something like this:
```
from tensorflow.keras.callbacks import ModelCheckpoint

myModelCheckpoint = ModelCheckpoint("dirname")  # one instance, reused across all fit() calls
for _ in range(nIterations):
    x_train, y_train = getTrainingDataFromGenerators()
    model.fit(x_train, y_train, ..., epochs=10, callbacks=[myModelCheckpoint])
```
What I want is for ModelCheckpoint to save only the single best model across all nIterations. But it seems like its notion of "best" resets for each model.fit(): I’ve seen a model get saved with a val_acc that is lower than the best val_acc from a previous model.fit().
Essentially I want a global ModelCheckpoint, not local to a particular model.fit(). Is that possible?
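To illustrate the behavior I’m after, here’s a rough sketch (assuming tf.keras; the class name GlobalBestCheckpoint and the monitor name val_accuracy are made up for illustration): a custom callback whose best-so-far value lives on the object itself, so it survives across fit() calls instead of resetting.

```
from tensorflow.keras.callbacks import Callback

class GlobalBestCheckpoint(Callback):
    """Save the model only when the monitored metric beats the best value
    seen so far across ALL fit() calls. The `best` attribute is stored on
    this object, so it is not reset when a new fit() starts."""

    def __init__(self, filepath, monitor="val_accuracy"):
        super().__init__()
        self.filepath = filepath
        self.monitor = monitor
        self.best = -float("inf")  # persists for the lifetime of this instance

    def on_epoch_end(self, epoch, logs=None):
        current = (logs or {}).get(self.monitor)
        if current is not None and current > self.best:
            self.best = current
            self.model.save(self.filepath)  # self.model is set by Keras during fit()
```

I’d then create one instance of this before the loop and pass that same instance into every fit() call, the same way I do with myModelCheckpoint now. Is something like this the right approach, or does ModelCheckpoint already support it?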
submitted by /u/Simusid