Implementing a BP Neural Network in Python with numpy
2018-07-27
This post implements a simple BP (backpropagation) neural network using nothing but numpy. Since the task is regression rather than classification, the output layer's activation function is simply the identity, f(x) = x. The underlying theory of BP networks is not covered here.
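For reference (this is the standard textbook derivation, not part of the original post), the updates implemented below follow from a squared-error loss with sigmoid hidden units h and an identity output y, target t, and learning rate \eta:

```latex
\delta^{(o)} = t - y, \qquad
\delta^{(h)} = \bigl(W_{ho}^{\top}\,\delta^{(o)}\bigr) \odot h \odot (1 - h),
```

```latex
W_{ho} \leftarrow W_{ho} + \eta\,\delta^{(o)} h^{\top}, \qquad
W_{ih} \leftarrow W_{ih} + \eta\,\delta^{(h)} x^{\top}.
```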
 

import numpy as np

class NeuralNetwork(object):
    def __init__(self, input_nodes, hidden_nodes, output_nodes, learning_rate):
        # Number of nodes in the input, hidden and output layers
        self.input_nodes = input_nodes
        self.hidden_nodes = hidden_nodes
        self.output_nodes = output_nodes

        # Initialize the weights (std dev scaled by 1/sqrt(fan-in)) and the learning rate
        self.weights_input_to_hidden = np.random.normal(
            0.0, self.input_nodes ** -0.5,
            (self.hidden_nodes, self.input_nodes))
        self.weights_hidden_to_output = np.random.normal(
            0.0, self.hidden_nodes ** -0.5,
            (self.output_nodes, self.hidden_nodes))
        self.lr = learning_rate

        # The hidden layer's activation function is the sigmoid
        self.activation_function = lambda x: 1.0 / (1.0 + np.exp(-x))

    def train(self, inputs_list, targets_list):
        # Convert the input and target lists to column vectors
        inputs = np.array(inputs_list, ndmin=2).T    # shape: (feature_dimension, 1)
        targets = np.array(targets_list, ndmin=2).T  # shape: (output_nodes, 1)

        # Forward pass
        # Hidden layer
        hidden_inputs = np.dot(self.weights_input_to_hidden, inputs)  # signals into hidden layer
        hidden_outputs = self.activation_function(hidden_inputs)      # signals from hidden layer

        # Output layer; its activation function is the identity, y = x
        final_inputs = np.dot(self.weights_hidden_to_output, hidden_outputs)  # signals into final output layer
        final_outputs = final_inputs                                          # signals from final output layer

        ### Backward pass: update the weights with gradient descent ###

        # Output layer error is the difference between desired target and actual output
        output_errors = targets - final_outputs  # shape: (output_nodes, 1)

        # Error backpropagated to the hidden layer, including the sigmoid derivative
        hidden_errors = np.dot(self.weights_hidden_to_output.T, output_errors) \
            * hidden_outputs * (1.0 - hidden_outputs)  # shape: (hidden_nodes, 1)

        # Update the hidden-to-output weights with a gradient descent step
        self.weights_hidden_to_output += self.lr * np.dot(output_errors, hidden_outputs.T)
        # Update the input-to-hidden weights with a gradient descent step
        self.weights_input_to_hidden += self.lr * np.dot(hidden_errors, inputs.T)

    def run(self, inputs_list):
        # Run a forward pass through the network to make a prediction
        inputs = np.array(inputs_list, ndmin=2).T

        # Hidden layer
        hidden_inputs = np.dot(self.weights_input_to_hidden, inputs)  # signals into hidden layer
        hidden_outputs = self.activation_function(hidden_inputs)      # signals from hidden layer

        # Output layer
        final_inputs = np.dot(self.weights_hidden_to_output, hidden_outputs)  # signals into final output layer
        final_outputs = final_inputs  # signals from final output layer

        return final_outputs
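As a quick sanity check, here is a toy usage sketch (my own example, not from the original post): training the network on the 1-D regression target y = 2x + 1 and watching the mean squared error fall. The class is repeated in condensed form so the snippet runs on its own; the hyperparameters (8 hidden nodes, learning rate 0.1, 300 epochs) are arbitrary choices.

```python
import numpy as np

# Condensed copy of the NeuralNetwork class from this post, so the demo is self-contained.
class NeuralNetwork(object):
    def __init__(self, input_nodes, hidden_nodes, output_nodes, learning_rate):
        self.weights_input_to_hidden = np.random.normal(
            0.0, input_nodes ** -0.5, (hidden_nodes, input_nodes))
        self.weights_hidden_to_output = np.random.normal(
            0.0, hidden_nodes ** -0.5, (output_nodes, hidden_nodes))
        self.lr = learning_rate
        self.activation_function = lambda x: 1.0 / (1.0 + np.exp(-x))

    def train(self, inputs_list, targets_list):
        inputs = np.array(inputs_list, ndmin=2).T
        targets = np.array(targets_list, ndmin=2).T
        hidden_outputs = self.activation_function(
            np.dot(self.weights_input_to_hidden, inputs))
        final_outputs = np.dot(self.weights_hidden_to_output, hidden_outputs)
        output_errors = targets - final_outputs
        hidden_errors = np.dot(self.weights_hidden_to_output.T, output_errors) \
            * hidden_outputs * (1.0 - hidden_outputs)
        self.weights_hidden_to_output += self.lr * np.dot(output_errors, hidden_outputs.T)
        self.weights_input_to_hidden += self.lr * np.dot(hidden_errors, inputs.T)

    def run(self, inputs_list):
        inputs = np.array(inputs_list, ndmin=2).T
        hidden_outputs = self.activation_function(
            np.dot(self.weights_input_to_hidden, inputs))
        return np.dot(self.weights_hidden_to_output, hidden_outputs)

np.random.seed(0)
net = NeuralNetwork(input_nodes=1, hidden_nodes=8, output_nodes=1, learning_rate=0.1)

xs = np.random.rand(100)   # 100 scalar inputs in [0, 1)
ys = 2.0 * xs + 1.0        # regression targets for y = 2x + 1

def mse():
    preds = np.array([net.run([x]).item() for x in xs])
    return float(np.mean((preds - ys) ** 2))

loss_before = mse()
for _ in range(300):       # 300 epochs of per-sample gradient descent
    for x, y in zip(xs, ys):
        net.train([x], [y])
loss_after = mse()
print(loss_before, loss_after)  # the loss should drop sharply
```

With per-sample updates like this, shuffling the samples each epoch would be the more usual choice; it is omitted here to keep the sketch short.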
