
Building an LSTM in PyTorch for Multivariate Multi-Step Time-Series Load Forecasting

Contents
I. Preface
II. Data Processing
III. LSTM Model
IV. Training and Prediction
V. Source Code and Data

I. Preface

In the two previous articles, Building an LSTM in PyTorch for Time-Series Forecasting (Load Forecasting) and Building an LSTM in PyTorch for Multivariate Time-Series Forecasting (Load Forecasting), we used an LSTM to implement univariate single-step and multivariate single-step time-series forecasting, respectively.

This article covers building an LSTM in PyTorch for multivariate multi-step time-series forecasting.

Articles in this series:

Building a Bidirectional LSTM in PyTorch for Time-Series Load Forecasting

Building an LSTM in PyTorch for Multivariate Time-Series Load Forecasting

PyTorch Deep Learning: LSTM from input to Linear Output

Building an LSTM in PyTorch for Time-Series Load Forecasting

II. Data Processing

The dataset contains the electric load of a certain region over a period of time; besides the load, it also includes environmental information such as temperature and humidity.

In this article, we predict the load at the next 4 time steps (the number of steps is adjustable) from the load at the previous 24 time steps together with the environmental variables at each of those steps.

import os

import numpy as np
import pandas as pd
import torch
from torch import nn
from torch.utils.data import Dataset, DataLoader


def load_data(file_name):
    global MAX, MIN
    df = pd.read_csv(os.path.dirname(os.getcwd()) + '/data/new_data/' + file_name, encoding='gbk')
    columns = df.columns
    df.fillna(df.mean(), inplace=True)
    # min-max normalize the load column (column index 1)
    MAX = np.max(df[columns[1]])
    MIN = np.min(df[columns[1]])
    df[columns[1]] = (df[columns[1]] - MIN) / (MAX - MIN)
    return df
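MAX and MIN are kept as globals so that predictions can later be mapped back to the original scale. A minimal sketch of that inverse transform (the denormalize helper is our own name, not part of the original source):

def denormalize(y):
    # invert the min-max scaling applied in load_data
    return y * (MAX - MIN) + MIN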

class MyDataset(Dataset):
    def __init__(self, data):
        self.data = data

    def __getitem__(self, item):
        return self.data[item]

    def __len__(self):
        return len(self.data)

def nn_seq(file_name, B, num):
    print('data processing...')
    data = load_data(file_name)
    load = data[data.columns[1]]
    load = load.tolist()
    data = data.values.tolist()
    seq = []
    # slide a window of 24 input steps followed by num target steps
    for i in range(0, len(data) - 24 - num, num):
        train_seq = []
        train_label = []
        for j in range(i, i + 24):
            # load value plus six environmental variables -> 7 features per step
            x = [load[j]]
            for c in range(2, 8):
                x.append(data[j][c])
            train_seq.append(x)
        for j in range(i + 24, i + 24 + num):
            train_label.append(load[j])
        train_seq = torch.FloatTensor(train_seq)
        train_label = torch.FloatTensor(train_label).view(-1)
        seq.append((train_seq, train_label))

    # 70/30 train/test split, truncated to whole batches
    Dtr = seq[0:int(len(seq) * 0.7)]
    Dte = seq[int(len(seq) * 0.7):len(seq)]
    train_len = int(len(Dtr) / B) * B
    test_len = int(len(Dte) / B) * B
    Dtr, Dte = Dtr[:train_len], Dte[:test_len]
    train = MyDataset(Dtr)
    test = MyDataset(Dte)
    Dtr = DataLoader(dataset=train, batch_size=B, shuffle=False, num_workers=0)
    Dte = DataLoader(dataset=test, batch_size=B, shuffle=False, num_workers=0)
    return Dtr, Dte

Here num is the number of steps to forecast; num=4, for example, means predicting the load at the next 4 time steps.
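As a usage sketch (the file name and batch size here are assumptions, not taken from the original):

# build training and test loaders: batch size 30, forecast 4 steps ahead
Dtr, Dte = nn_seq('load_data.csv', B=30, num=4)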

Printing an arbitrary sample:

(tensor([[0.5830, 1.0000, 0.9091, 0.6957, 0.8333, 0.4884, 0.5122],
         [0.6215, 1.0000, 0.9091, 0.7391, 0.8333, 0.4884, 0.5122],
         [0.5954, 1.0000, 0.9091, 0.7826, 0.8333, 0.4884, 0.5122],
         [0.5391, 1.0000, 0.9091, 0.8261, 0.8333, 0.4884, 0.5122],
         [0.5351, 1.0000, 0.9091, 0.8696, 0.8333, 0.4884, 0.5122],
         [0.5169, 1.0000, 0.9091, 0.9130, 0.8333, 0.4884, 0.5122],
         [0.4694, 1.0000, 0.9091, 0.9565, 0.8333, 0.4884, 0.5122],
         [0.4489, 1.0000, 0.9091, 1.0000, 0.8333, 0.4884, 0.5122],
         [0.4885, 1.0000, 0.9091, 0.0000, 1.0000, 0.3256, 0.3902],
         [0.4612, 1.0000, 0.9091, 0.0435, 1.0000, 0.3256, 0.3902],
         [0.4229, 1.0000, 0.9091, 0.0870, 1.0000, 0.3256, 0.3902],
         [0.4173, 1.0000, 0.9091, 0.1304, 1.0000, 0.3256, 0.3902],
         [0.4503, 1.0000, 0.9091, 0.1739, 1.0000, 0.3256, 0.3902],
         [0.4502, 1.0000, 0.9091, 0.2174, 1.0000, 0.3256, 0.3902],
         [0.5426, 1.0000, 0.9091, 0.2609, 1.0000, 0.3256, 0.3902],
         [0.5579, 1.0000, 0.9091, 0.3043, 1.0000, 0.3256, 0.3902],
         [0.6035, 1.0000, 0.9091, 0.3478, 1.0000, 0.3256, 0.3902],
         [0.6540, 1.0000, 0.9091, 0.3913, 1.0000, 0.3256, 0.3902],
         [0.6181, 1.0000, 0.9091, 0.4348, 1.0000, 0.3256, 0.3902],
         [0.6334, 1.0000, 0.9091, 0.4783, 1.0000, 0.3256, 0.3902],
         [0.6297, 1.0000, 0.9091, 0.5217, 1.0000, 0.3256, 0.3902],
         [0.5610, 1.0000, 0.9091, 0.5652, 1.0000, 0.3256, 0.3902],
         [0.5957, 1.0000, 0.9091, 0.6087, 1.0000, 0.3256, 0.3902],
         [0.6427, 1.0000, 0.9091, 0.6522, 1.0000, 0.3256, 0.3902]]),
 tensor([0.6360, 0.6996, 0.6889, 0.6434]))

Each sample has the form (X, Y). X has 24 rows: the load at each of the previous 24 time steps together with that step's environmental variables. Y contains the four load values to be predicted. Note that this makes input_size=7 and output_size=4.
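A quick shape check on one batch from the loaders above (assuming B=30 as in the earlier sketch):

seq0, label0 = next(iter(Dtr))
print(seq0.shape)    # torch.Size([30, 24, 7])  -> (batch_size, seq_len, input_size)
print(label0.shape)  # torch.Size([30, 4])      -> (batch_size, output_size)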

III. LSTM Model

We reuse the model from the earlier article Understanding the Input and Output of LSTM in PyTorch (from input to Linear Output):

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')


class LSTM(nn.Module):
    def __init__(self, input_size, hidden_size, num_layers, output_size, batch_size):
        super().__init__()
        self.input_size = input_size
        self.hidden_size = hidden_size
        self.num_layers = num_layers
        self.output_size = output_size
        self.num_directions = 1
        self.batch_size = batch_size
        self.lstm = nn.LSTM(self.input_size, self.hidden_size, self.num_layers, batch_first=True)
        self.linear = nn.Linear(self.hidden_size, self.output_size)

    def forward(self, input_seq):
        h_0 = torch.randn(self.num_directions * self.num_layers, self.batch_size, self.hidden_size).to(device)
        c_0 = torch.randn(self.num_directions * self.num_layers, self.batch_size, self.hidden_size).to(device)
        seq_len = input_seq.shape[1]
        # input(batch_size, seq_len, input_size)
        input_seq = input_seq.view(self.batch_size, seq_len, self.input_size)
        # output(batch_size, seq_len, num_directions * hidden_size)
        output, _ = self.lstm(input_seq, (h_0, c_0))
        # flatten to (batch_size * seq_len, hidden_size) for the Linear layer
        output = output.contiguous().view(self.batch_size * seq_len, self.hidden_size)
        pred = self.linear(output)
        pred = pred.view(self.batch_size, seq_len, -1)
        # keep only the prediction at the last time step -> (batch_size, output_size)
        pred = pred[:, -1, :]
        return pred
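As a usage sketch, instantiating the model to match the shapes above (hidden_size and num_layers are assumed values, not taken from the original):

model = LSTM(input_size=7, hidden_size=64, num_layers=1, output_size=4, batch_size=30).to(device)
pred = model(seq0.to(device))
print(pred.shape)  # torch.Size([30, 4]) -> one 4-step forecast per sequence in the batch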
