TensorFlow 2.0 Tutorial: 3

1. Simple RNN

Below we create a simple 2-layer RNN with 100 neurons per layer; the output layer is a Dense layer with a single neuron:

from tensorflow import keras

# Two stacked SimpleRNN layers with 100 units each; the first returns the
# full sequence so the second layer receives one vector per time step.
model1 = keras.models.Sequential()
model1.add(keras.layers.SimpleRNN(100, return_sequences=True, input_shape=input_shape))
model1.add(keras.layers.SimpleRNN(100))
model1.add(keras.layers.Dense(1))  # single-neuron output layer
model1.compile(loss="mse", optimizer=keras.optimizers.SGD(learning_rate=0.005), metrics=["mae"])
history1 = model1.fit(X_train_3D, y_train, epochs=200, batch_size=200,
                      validation_data=(X_valid_3D, y_valid))
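
The code assumes that input_shape and the windowed arrays X_train_3D, y_train, X_valid_3D, y_valid already exist; they come from the data-preparation step earlier in this series, which is not shown here. A minimal sketch of what that preparation might look like, using synthetic stand-in data and a hypothetical window length of 50:

import numpy as np

# Hypothetical setup (not from the original post): 10,000 windows of 50 steps
# from a univariate series, with the value at step 51 as the target.
n_steps = 50
series = np.random.rand(10000, n_steps + 1, 1).astype(np.float32)  # stand-in for real data

X_train_3D, y_train = series[:7000, :n_steps], series[:7000, -1, 0]
X_valid_3D, y_valid = series[7000:9000, :n_steps], series[7000:9000, -1, 0]

input_shape = (n_steps, 1)  # (time steps, features) expected by the first recurrent layer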

2. Seq2Seq

Build a Seq2Seq model. The difference from the simple RNN above is that the second RNN layer also uses return_sequences=True, so the model outputs a prediction at every time step:

# Both recurrent layers return sequences, so the Dense layer is applied
# at every time step and the model outputs a full sequence.
model2 = keras.models.Sequential()
model2.add(keras.layers.SimpleRNN(100, return_sequences=True, input_shape=input_shape))
model2.add(keras.layers.SimpleRNN(100, return_sequences=True))
model2.add(keras.layers.Dense(1))
model2.compile(loss=huber_loss, optimizer=keras.optimizers.SGD(learning_rate=0.01),
               metrics=[mae_last_step])
history2 = model2.fit(X_train_3D, Y_train_3D, epochs=200, batch_size=200,
                      validation_data=(X_valid_3D, Y_valid_3D))
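
huber_loss and mae_last_step are not defined in this part; they are presumably a custom loss and metric from earlier in the series. One plausible sketch, assuming the built-in Keras Huber loss and an MAE that looks only at the final time step of each output sequence:

import tensorflow as tf
from tensorflow import keras

# Assumed definitions (not from the original post).
huber_loss = keras.losses.Huber()

def mae_last_step(y_true, y_pred):
    # Mean absolute error measured only at the last time step of each window.
    return tf.reduce_mean(tf.abs(y_true[:, -1] - y_pred[:, -1]))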

3. Seq2Seq LSTM

Replace the plain RNN layers with LSTM layers:

# Same Seq2Seq architecture as model2, with LSTM cells instead of SimpleRNN.
model3 = keras.models.Sequential()
model3.add(keras.layers.LSTM(100, return_sequences=True, input_shape=input_shape))
model3.add(keras.layers.LSTM(100, return_sequences=True))
model3.add(keras.layers.Dense(1))
model3.compile(loss=huber_loss, optimizer=keras.optimizers.SGD(learning_rate=0.01),
               metrics=[mae_last_step])
# ReduceLROnPlateau lowers the learning rate when the validation loss stops improving.
history3 = model3.fit(X_train_3D, Y_train_3D, epochs=200, batch_size=200,
                      validation_data=(X_valid_3D, Y_valid_3D),
                      callbacks=[keras.callbacks.ReduceLROnPlateau(verbose=1)])
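
After training, the Seq2Seq LSTM predicts one value per time step; in a forecasting setup you usually keep only the last step of each window. A short usage sketch, assuming the shapes from the data preparation above:

# Predict a full sequence and keep only the final time step of each window.
Y_pred_3D = model3.predict(X_valid_3D)   # shape: (batch, n_steps, 1)
y_pred_last = Y_pred_3D[:, -1, 0]        # forecast for the step after each window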

4. 1D-ConvNet preprocessing

Add a 1D-ConvNet preprocessing stage in front of the recurrent layers:

# A Conv1D + MaxPool1D front end shortens the sequence before the LSTMs,
# so the recurrent layers process fewer, higher-level time steps.
model4 = keras.models.Sequential()
model4.add(keras.layers.Conv1D(32, kernel_size=5, input_shape=input_shape))
model4.add(keras.layers.MaxPool1D(pool_size=5, strides=2))
model4.add(keras.layers.LSTM(32, return_sequences=True))
model4.add(keras.layers.LSTM(32))
model4.add(keras.layers.Dense(1))  # last LSTM returns only its final state, so this is a single output
model4.compile(loss=huber_loss, optimizer=keras.optimizers.SGD(learning_rate=0.005))
history4 = model4.fit(X_train_3D, y_train, epochs=200, batch_size=100,
                      validation_data=(X_valid_3D, y_valid))
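
With the default padding='valid', Conv1D(kernel_size=5) shortens each window by 4 steps, and MaxPool1D(pool_size=5, strides=2) roughly halves what remains, so the LSTMs see far fewer time steps. A quick check of the layer shapes, assuming the hypothetical window length of 50 used above:

model4.summary()
# With input_shape = (50, 1):
#   Conv1D(kernel_size=5, padding='valid') -> 50 - 5 + 1 = 46 steps
#   MaxPool1D(pool_size=5, strides=2)      -> (46 - 5) // 2 + 1 = 21 steps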

Hello everyone!

I'm 不會停的蝸牛 Alice.

I love artificial intelligence, and I write a bit of hands-on machine learning material every day.

