Author: taylor0607 (加菲猫星人)
Board: DataScience
Title: [Question] LSTM prediction problem
Time: Fri Jun 1 00:17:12 2018
Hi everyone~
I've studied some data mining (DM) methods
and am now trying out prediction with Keras.
The problem I'm running into is this:
I want to use 25 columns to predict a Y value.
The traditional DM approach is to feed new data into the trained model and let it predict a new Y.
But in Keras, the predicted y comes out as an array.
So my question is: how do I convert new data into a form the trained model can take? (I've put a rough sketch of what I mean right after the code below.) Thanks!
Here is my code:
import numpy
import pandas as pd
from sklearn import preprocessing  # imported but not used below

numpy.random.seed(10)

all_df = pd.read_csv("/Users/mac/Desktop/123.csv")

# column names
cols = ['x1', 'x2', 'x3', 'x4', 'x5', 'x6', 'x7', 'x8', 'x9', 'x10',
        'x11', 'x12', 'x13', 'x14', 'x15', 'x16', 'x17', 'x18', 'x19',
        'x20', 'x21', 'x22', 'x23', 'x24', 'x25']
all_df = all_df[cols]

# random 80/20 split into train/test DataFrames
msk = numpy.random.rand(len(all_df)) < 0.8
train_df = all_df[msk]
test_df = all_df[~msk]

# features: all 25 columns; label: x25 (both taken from all_df)
train_Features = all_df[cols]
train_Label = all_df['x25']
test_Features = all_df[cols]
test_Label = all_df['x25']

print(len(train_df))
print(len(test_df))
print(len(all_df))

from keras.models import Sequential
from keras.layers import Dense, Dropout

# fully connected network: 25 inputs -> 40 -> 30 -> 1 (sigmoid)
model = Sequential()
model.add(Dense(units=40, input_dim=25,
                kernel_initializer='uniform',
                activation='relu'))
model.add(Dense(units=30,
                kernel_initializer='uniform',
                activation='relu'))
model.add(Dense(units=1,
                kernel_initializer='uniform',
                activation='sigmoid'))
model.compile(loss='binary_crossentropy',
              optimizer='adam', metrics=['accuracy'])

train_history = model.fit(x=train_Features,
                          y=train_Label,
                          validation_split=0.1,
                          epochs=30,
                          batch_size=30, verbose=2)
train_history

scores = model.evaluate(x=test_Features, y=test_Label)
scores[1]
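What I'm trying to do next is roughly this (just a sketch, not in my code yet; new_df is a placeholder for a DataFrame of new rows with the same 25 columns, and I'm not sure this is the right form to feed in):

new_X = new_df[cols].values      # numpy array of shape (n_rows, 25), same column order as in training
pred = model.predict(new_X)      # Keras returns this as an array of shape (n_rows, 1)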
Here are my model results:
runfile('/Users/mac/.spyder-py3/temp.py', wdir='/Users/mac/.spyder-py3')
74
16
90
Train on 81 samples, validate on 9 samples
Epoch 1/30
- 1s - loss: 0.6929 - acc: 0.4198 - val_loss: 0.6937 - val_acc: 0.1111
Epoch 2/30
- 0s - loss: 0.6902 - acc: 0.1852 - val_loss: 0.6944 - val_acc: 0.1111
Epoch 3/30
- 0s - loss: 0.6877 - acc: 0.1605 - val_loss: 0.6951 - val_acc: 0.1111
Epoch 4/30
- 0s - loss: 0.6851 - acc: 0.1605 - val_loss: 0.6957 - val_acc: 0.1111
Epoch 5/30
- 0s - loss: 0.6813 - acc: 0.1605 - val_loss: 0.6963 - val_acc: 0.1111
Epoch 6/30
- 0s - loss: 0.6767 - acc: 0.1852 - val_loss: 0.6970 - val_acc: 0.1111
Epoch 7/30
- 0s - loss: 0.6708 - acc: 0.2099 - val_loss: 0.6975 - val_acc: 0.1111
Epoch 8/30
- 0s - loss: 0.6628 - acc: 0.2222 - val_loss: 0.6979 - val_acc: 0.1111
Epoch 9/30
- 0s - loss: 0.6534 - acc: 0.3210 - val_loss: 0.6984 - val_acc: 0.1111
Epoch 10/30
- 0s - loss: 0.6397 - acc: 0.3580 - val_loss: 0.6986 - val_acc: 0.2222
Epoch 11/30
- 0s - loss: 0.6244 - acc: 0.4321 - val_loss: 0.6990 - val_acc: 0.2222
Epoch 12/30
- 0s - loss: 0.6039 - acc: 0.4815 - val_loss: 0.6990 - val_acc: 0.2222
Epoch 13/30
- 0s - loss: 0.5758 - acc: 0.5309 - val_loss: 0.6988 - val_acc: 0.2222
Epoch 14/30
- 0s - loss: 0.5467 - acc: 0.5432 - val_loss: 0.6990 - val_acc: 0.2222
Epoch 15/30
- 0s - loss: 0.5088 - acc: 0.5432 - val_loss: 0.6991 - val_acc: 0.2222
Epoch 16/30
- 0s - loss: 0.4600 - acc: 0.5432 - val_loss: 0.6986 - val_acc: 0.3333
Epoch 17/30
- 0s - loss: 0.4149 - acc: 0.5556 - val_loss: 0.6988 - val_acc: 0.3333
Epoch 18/30
- 0s - loss: 0.3513 - acc: 0.5679 - val_loss: 0.6993 - val_acc: 0.4444
Epoch 19/30
- 0s - loss: 0.2774 - acc: 0.5556 - val_loss: 0.6992 - val_acc: 0.4444
Epoch 20/30
- 0s - loss: 0.2010 - acc: 0.5556 - val_loss: 0.7004 - val_acc: 0.4444
Epoch 21/30
- 0s - loss: 0.1163 - acc: 0.5556 - val_loss: 0.7034 - val_acc: 0.4444
Epoch 22/30
- 0s - loss: 0.0139 - acc: 0.5556 - val_loss: 0.7056 - val_acc: 0.4444
Epoch 23/30
- 0s - loss: -8.1930e-02 - acc: 0.5679 - val_loss: 0.7121 - val_acc: 0.4444
Epoch 24/30
- 0s - loss: -1.9559e-01 - acc: 0.5679 - val_loss: 0.7214 - val_acc: 0.4444
Epoch 25/30
- 0s - loss: -3.2348e-01 - acc: 0.5679 - val_loss: 0.7327 - val_acc: 0.4444
Epoch 26/30
- 0s - loss: -4.4836e-01 - acc: 0.5802 - val_loss: 0.7467 - val_acc: 0.4444
Epoch 27/30
- 0s - loss: -5.7915e-01 - acc: 0.5802 - val_loss: 0.7694 - val_acc: 0.4444
Epoch 28/30
- 0s - loss: -7.3865e-01 - acc: 0.5802 - val_loss: 0.7944 - val_acc: 0.4444
Epoch 29/30
- 0s - loss: -8.9148e-01 - acc: 0.5802 - val_loss: 0.8236 - val_acc: 0.4444
Epoch 30/30
- 0s - loss: -1.0620e+00 - acc: 0.5802 - val_loss: 0.8666 - val_acc: 0.4444
90/90 [==============================] - 0s 49us/step
--
※ Posted from: PTT (ptt.cc), From: 119.14.41.117
※ Article URL: https://webptt.com/cn.aspx?n=bbs/DataScience/M.1527783435.A.9A9.html
1F:推 HYDE1986: So your model is already trained and you want to use new data to make predictions? 06/01 09:18
2F:推 ax61316: What do your LSTM prediction results look like? Can you show the code? 06/01 11:02
3F:推 tsoahans: Are you trying to do a many-to-one or a many-to-many 06/01 13:56
4F:→ tsoahans: prediction? 06/01 13:57
※ Edited: taylor0607 (27.246.68.149), 06/01/2018 15:04:55
5F:→ taylor0607: OK, I've added it. 06/01 15:05
6F:→ taylor0607: Yes, I want to predict a new column, like in DM. 06/01 15:05
7F:→ taylor0607: To t大: mine is many-to-one. 06/01 15:06
8F:→ Kazimir: Where is the LSTM? Isn't what you imported Dense? 06/01 16:42
※ Edited: taylor0607 (27.246.68.149), 06/01/2018 17:04:09
9F:→ taylor0607: Ah, sorry, it's Keras. 06/01 17:04
10F:推 Kazimir: Feed it in the same way you tested it during training. Is scores empty? 06/01 17:18
11F:→ tsoahans: Are you asking how to predict on the test data? 06/01 17:54
12F:推 HYDE1986: In Keras you can just use model.predict; the official docs explain 06/01 17:58
13F:→ HYDE1986: the parameters (see the sketch after this thread). 06/01 17:58
15F:→ taylor0607: OK, thanks~ 06/01 19:15
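A minimal sketch of the model.predict route suggested above, assuming the new rows arrive in a DataFrame (called new_df here; the name is a placeholder, not from the thread) with the same 25 feature columns used for training:

# new_df: placeholder DataFrame holding the new rows, same 25 columns as training
new_X = new_df[cols].values                             # 2-D numpy array, shape (n_rows, 25)
probs = model.predict(new_X)                            # sigmoid outputs, array of shape (n_rows, 1)
new_df['Y_pred'] = (probs.ravel() > 0.5).astype(int)    # flatten and threshold into a 0/1 column

The predict output is the (n_rows, 1) array mentioned in the post; ravel() flattens it so it can be written back as a single new column, matching the "predict a new column, like in DM" goal described in the replies.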