I’ve experimented with scikit-learn’s MLPRegressor class and found that it performs fairly well on the dataset I’m looking at without much tuning. Here’s what I’ve been using so far:
```python
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import Pipeline
from sklearn.compose import TransformedTargetRegressor
from sklearn.preprocessing import StandardScaler

base = MLPRegressor(max_iter=50, hidden_layer_sizes=(100,),
                    early_stopping=True, learning_rate="adaptive")
pipeline = Pipeline([('scaler', StandardScaler()), ('model', base)])
model = TransformedTargetRegressor(regressor=pipeline, transformer=StandardScaler())
```
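Fitting and scoring then work like any other scikit-learn estimator; something along these lines (a minimal sketch, with `train_X`/`train_y` standing in for my training arrays):

```python
model.fit(train_X, train_y)
# The default score for a scikit-learn regressor is R^2 on the given data
print(model.score(train_X, train_y))
```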
What I’d like to do is implement something virtually identical in TensorFlow as a stepping stone to making a more complicated model with separate LSTM and Dense channels. Here’s what I have so far:
```python
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model
from tensorflow.keras.callbacks import EarlyStopping
from tensorflow.keras.wrappers.scikit_learn import KerasRegressor

def simple_tf_model():
    dense_input = Input(shape=(train_X.shape[1],))
    dense = Dense(100, activation="relu")(dense_input)
    dense = Dense(1)(dense)
    tf_model = Model(inputs=[dense_input], outputs=dense)
    tf_model.compile(loss="mse", optimizer="adam")
    return tf_model

# monitor='val_loss' with validation_split=0.1 based on the early_stopping
# and validation_fraction parameters of MLPRegressor
es = EarlyStopping(monitor='val_loss', mode='min', verbose=1)

# batch_size=200 used as equivalent to batch_size="auto" for MLPRegressor
tf_model = KerasRegressor(build_fn=simple_tf_model, batch_size=200, epochs=50,
                          validation_split=0.1, callbacks=[es])

pipeline = Pipeline([('scaler', StandardScaler()), ('model', tf_model)])
model = TransformedTargetRegressor(regressor=pipeline, transformer=StandardScaler())
model.fit(train_X, train_y)
```
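A note for anyone on a recent TensorFlow: the `tensorflow.keras.wrappers.scikit_learn` module has since been deprecated and removed, with the `scikeras` package as its maintained replacement. If I understand its API correctly, the wrapper line above would become roughly this (assuming `pip install scikeras`; I haven’t verified that every default matches the old wrapper):

```python
from scikeras.wrappers import KerasRegressor  # replaces the removed TF-bundled wrapper

# scikeras takes the model-building callable via `model=` rather than `build_fn=`
tf_model = KerasRegressor(model=simple_tf_model, batch_size=200, epochs=50,
                          validation_split=0.1, callbacks=[es])
```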
However, when I run this, the MLPRegressor model performs noticeably better than the TensorFlow version.
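To make “noticeably better” concrete, the comparison looks something like this (a sketch; `test_X`/`test_y` stand in for a held-out split, and `sk_model`/`keras_model` for the two fitted `TransformedTargetRegressor` pipelines above):

```python
from sklearn.metrics import mean_squared_error

# sk_model and keras_model are the scikit-learn and Keras pipelines from above,
# renamed here so both fitted models can coexist
for name, m in [("MLPRegressor", sk_model), ("Keras", keras_model)]:
    preds = m.predict(test_X)
    print(f"{name}: MSE = {mean_squared_error(test_y, preds):.4f}")
```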
Am I doing something wrong in the TensorFlow implementation?
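For context, the more complicated model I’m building toward would have roughly this shape (a sketch only; the sequence input dimensions, layer widths, and the `Concatenate` merge are placeholders):

```python
from tensorflow.keras.layers import Input, Dense, LSTM, Concatenate
from tensorflow.keras.models import Model

def two_channel_model(n_timesteps, n_seq_features, n_static_features):
    # LSTM channel for the sequential features
    seq_input = Input(shape=(n_timesteps, n_seq_features))
    seq_branch = LSTM(50)(seq_input)

    # Dense channel for the static features
    static_input = Input(shape=(n_static_features,))
    static_branch = Dense(100, activation="relu")(static_input)

    # Merge the two channels and regress to a single output
    merged = Concatenate()([seq_branch, static_branch])
    output = Dense(1)(merged)

    model = Model(inputs=[seq_input, static_input], outputs=output)
    model.compile(loss="mse", optimizer="adam")
    return model
```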